Okay, you're live. Okay, great. Well, thank you, everybody, for joining today for the Climate Action and Accounting Special Interest Group. I want to kick off this meeting by recognizing that we follow the Linux Foundation Antitrust Policy, so if you want to take a look at that, there's a link to it as well as to the Hyperledger Code of Conduct. I want to start off by allowing anybody who is new today to take a quick moment to introduce themselves. We'd love to hear a little bit about what brought you here, your background, and your interest in the SIG. Is there anybody here for whom this is the first time joining the call?

So I'm a newbie. I'm Jim Whitestone. I'm the Chief Economist with Convergence Tech. We're actively trying to develop a range of traceability platforms focused on carbon offsetting and insetting. So I'm keenly interested in how you layer the measurement of scope three emissions onto that traceability platform, targeting companies or industries trying to make private claims around net zero or carbon neutrality.

Fantastic. Glad to have you here. We definitely share the same interest, so you're in the right place. Thank you for joining us. Is there anybody else?

I can introduce myself. My name is Josh. I'm with Tracer DAO. It is a platform for creating markets, environmental markets or any markets at that. And we're really keenly interested in this space because I think if you can, not necessarily financialize, but find a good price for some of these different segments, then you can create action around it. So we're really interested in some of these markets. And yeah, we're here to learn.

Yeah, welcome. Welcome. Okay.

I am part of TSE as well, I work at Walnut, and I'm here to learn more about this discussion, which sounded interesting.

Fantastic. Great.
Well, today we are very lucky to have Wes Geisenberger. He's now with the HBAR Foundation, where he's VP of Sustainability and ESG, but he was formerly the director of business development for Hedera Hashgraph. And he's going to be sharing a bit about open standards today, the advantage of using public ledgers for enhanced transparency in carbon accounting, and some real-world use cases as well. And I'm really excited. I've been digging into Hedera a little bit and am very interested in the gossip consensus mechanism, and almost blew my brain trying to understand how it actually works. So excited to have him here. Would you like to share your screen, Wes?

Yeah, I'd be happy to. I'm bringing a presentation today, but please feel free to make it a conversation versus just a presentation. I'm happy to answer any questions, and if you want to do a reaction or raise a hand, or even just say, hey, I have a question about this, I'm more than happy to stop, pause, and answer any questions that I can.

Yeah, absolutely, Wes. And I'll be managing the chat. I'll be keeping an eye open for questions, so I'll flag it if somebody raises a question.

Perfect. That sounds great. I'm going to stop sharing my screen, and I believe you should be able to share yours.

Perfect. Can everyone see my screen now? Yes, I can. Yes. Fantastic. Well, it's great to meet everyone, and thank y'all for taking some time to attend. As Sherwood said, my name is Wes Geisenberger. I previously helped lead business development for Hedera and, as of this week, am working with the HBAR Foundation. So rather recently, but looking over all things related to sustainability and ESG, as well as a few other areas.
And as part of my goals and my focus for the last year, year and a half, I've spent a lot of time in the carbon market space as well as, I'll call it, adjacent spaces, whether it's conservation or energy markets. And I've been looking at different ways that we can create auditable tokens on a public ledger and make sure that every aspect of them is traceable and tied to real-world events. One of the reasons I think this is important is because of some of the challenges we see today in carbon markets: double counting, and making sure that we have auditable frameworks with auditable data, where we know how the participants in the ecosystem are working, even while we still want to protect privacy information, for example, of individual roles and actors, whether it's a farmer in the carbon offset space, or protecting the privacy tied to individuals certifying specific devices, such as, say, an inverter for a home that may belong to just a mom or a pop in terms of the energy grid. And I've been trying to find a way to look at this traceability and make sure that it meets the merits of the challenges that we have, which are pretty steep in terms of what the Enhanced Transparency Framework calls for as part of the Paris Agreement. So we've been doing a lot of research on this and spending time at Hedera on the sustainability markets that are growing at a pretty rapid pace. And what we've done over the past year is we really started around standards and understanding: what, in fact, in the case of a carbon offset or a carbon removal, is a carbon removal from a ledger perspective? Regardless of whether it's Hyperledger, Hedera using our token service, Corda, or any other type of permissioned framework or public ledger, can we really define that this is in fact a carbon credit?
So we've been working with the InterWork Alliance, which is now part of the GBBC, in defining that from a token taxonomy perspective, and actually created a group as part of the Sustainability Business Working Group called the Voluntary Ecological Markets Task Force. That task force was tasked with defining these types of assets from a token perspective: how we create those audit trails, and how we create monitoring, reporting, and verifiable data tied to those tokens in a way that everyone can speak the same language. And from a Hedera perspective, we've taken that a step further. This week, we released the first reference implementation of the Voluntary Ecological Markets Task Force's guidelines and overview, for both the token itself and the audit trail. Because it's on a public ledger, it's fully auditable and discoverable in plain text, using verifiable credentials to describe the actors and decentralized identifiers to understand who is acting within the ecosystem, creating a chain of auditable linked events that describe those roles and actors. All the data they create is part of a fully auditable framework, at both the data level and the framework level, for these different assets that we produce. And beyond that, we've not just created the open source framework, but we have a lot of folks adopting that framework. We'll hear about folks like DOVU, who are doing that in the Hedera ecosystem for soil organic carbon use cases in the UK. And these different applications, some of which are grant recipients of Hedera and the HBAR Foundation, go ahead and implement these standards so that we have that fully auditable and discoverable asset.
And what's really important is that because these assets are discoverable, you can identify all the specific ESG or sustainability assets on Hedera, because of public ledger capabilities. We can actually create different topics for different carbon credits that are publicly visible, and anyone can subscribe and listen to them on Hedera, whether it's an enterprise who says, hey, I'm telling the world, here are my emissions and you can go double-check my work, or, here are the offsets that I bought to net out those emissions on Hedera. You can even break those down, in terms of scope three, scope two versus scope one accounting, which I heard someone mention, and how you bucket those emissions. There are processes for that that we can use as part of this auditable framework and Guardian methodology that we've released this week. And some of these application implementers are also ESG consultants who can help you do that, or whom you can work with to build it yourself. Beyond this, because these assets are auditable and discoverable, we're actually doing net new types of research, because these assets are oftentimes unique. I like to compare something like a carbon offset, down to the metric ton, to a non-fungible asset, where we have a specific geolocation and a specific methodology that it follows. Maybe we want to understand attributes like durability, additionality, and leakage. And specifically for emissions, maybe we want to match those different attributes to a specific geography in a specific time period for the vintage of the asset. So we've been thinking about all these different things over the last year and spending a lot of time to best understand that. And this comes back to some of the goals of the Paris Agreement, right?
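The idea above of treating each metric ton as a non-fungible asset with attributes like geolocation, vintage, durability, additionality, and leakage can be sketched as a small data model. All class, field, and function names here are hypothetical illustrations, not the actual IWA/VEM taxonomy schema:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CarbonOffsetToken:
    """One metric ton of CO2e, modeled as a non-fungible asset (illustrative)."""
    token_id: str
    methodology: str        # e.g. a soil-organic-carbon protocol identifier
    vintage_year: int       # the period in which the removal occurred
    latitude: float
    longitude: float
    durability_years: int   # how long the carbon is expected to stay stored
    additionality: bool     # would the removal have happened anyway?
    leakage_pct: float      # estimated displaced emissions, as a fraction

def matches(token: CarbonOffsetToken, year: int, bbox: tuple) -> bool:
    """Match an offset to an emission by vintage year and rough geography."""
    min_lat, min_lon, max_lat, max_lon = bbox
    return (token.vintage_year == year
            and min_lat <= token.latitude <= max_lat
            and min_lon <= token.longitude <= max_lon)

t = CarbonOffsetToken("0.0.1234/1", "soil-organic-carbon-v1", 2021,
                      51.5, -0.1, 100, True, 0.05)
print(matches(t, 2021, (50.0, -1.0, 52.0, 1.0)))  # True: same vintage, inside box
```

Because the attributes are explicit, a buyer could filter offsets by durability or leakage just as they would query any structured registry.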
It's not just about double spend, but about making sure we're helping folks measure their baselines from the supply side, understanding the emissions today and the mitigations we can create tomorrow in a publicly auditable way. Helping understand what assets are out there from a discoverability perspective to get to a global carbon price, easily auditable and discoverable because you can see the activity on a public network like Hedera. And going back to what we've done with the IWA, we're making sure that we follow standard rules from a technical perspective, so there are clear accounting and reporting rules that are all matched in a reference implementation. That makes it easier for folks to track every single role and actor, to make sure it's part of an auditable framework where all the data is tied to a specific token, so we can really understand the veracity and validity of that data, and ultimately the value of the token, in terms of the metric tons captured or the price, based on the data around things like additionality, durability, and leakage.

Hey, Wes, I've got a question from Erin Rosenberg. He just asked: what are some examples of standard ESG assets besides emissions and carbon offsets?

That's a great question. So we have plenty of different types of projects building on Hedera. One thing that you'll probably hear a little bit more about, maybe later this week, is related to something called the STAR metric, which is a conservation metric, a different type of asset. We've seen water rights token use cases as a different type of asset, and renewable energy credits. Some folks are even using the same framework to trade energy itself and bundle the renewable energy credits with it. So what we're going to talk about today is really a policy engine to create different types of assets, not just a single asset like a carbon offset token or a carbon emission token.

Great, thank you. Yeah, absolutely.
And one of the reasons we've spent time here, besides this being a really great use case that's good for governments, enterprises, and individuals to work on, is that we've seen this as a really big growing market on Hedera. It's naturally a DLT-focused use case because of the complex systems in carbon accounting. And there's also a growing market here: the market itself has surpassed a billion dollars in carbon offsets today, and we see it growing pretty significantly. I don't think anyone here needs an education there, so I'm going to skip past that based on the folks in the room, but I'm happy to answer any questions around our thoughts there. The challenge we really want to focus on is making it so easy that everyone can understand every single actor in the ecosystem and the role that they play, whether they're a certifier, an auditor, or a root authority. You can think of root authorities as everyone from an electric utility company, which may be state-owned or regulated as a private utility under a public regulator, to Verra or Gold Standard or their equivalents around the world. We're taking the methodologies that they lay out and mapping all the permission-based roles and ecosystem requirements for collecting data, and putting that auditably on ledger in a workflow-style format. Ultimately, what we get is every single attestation with a timestamp and a state proof, proving that this event happened at this time and that they're attesting to it. And if there is an instance where somebody is going and registering an asset in three different places, you're going to have the granular proof on Hedera of every single step along the way. And we see large players involved in this. Marley Gray is a good example.
He's an IWA chairman, and in one of his past quotes speaking about the IWA from a Microsoft perspective, they're really spending a lot of time on this because they're trying to find more verified supply with this level of granular data, focused on key attributes and metrics. Trying to find a verified asset or verified carbon offset with the level of granularity they require is hard; there just aren't a lot of them. That's one of the goals that we've gotten out of the IWA, and that's ultimately what we're building on top of Hedera. So I'm going to take a quick pause, then get into a little bit about Hedera and Hashgraph and some of the services we use, and then go back into the reference implementation, unless anyone has any questions. But I want to take a quick pause here.

Okay, with that, I'll talk a little about Hedera and the Hashgraph consensus algorithm. So Hashgraph is a little different than blockchain. It's a DAG structure for a DLT. The Hashgraph consensus mechanism doesn't use blocks or any type of proof-of-work-based system. It's a highly efficient gossip-about-gossip consensus protocol. What that really means is we have a bunch of different computers on the network, a bunch of different nodes. They're actually run by large governing council members, which I'll talk about in a few minutes. Some of those companies include the Googles of the world, IBM, Boeing, Tata, Électricité de France, and Standard Bank, amongst others. They're diversified across geographies and across industries. And these nodes that they're running are talking to each other constantly through the gossip protocol, which is the fastest way to disseminate information in terms of getting it out there and making sure that we don't have any information that can be disrupted.
From a gossip perspective, we attach a few key pieces of information: the last transaction the node sent and a hash of it, and the last message that that computer received and a hash of it. Ultimately, when we pass that message and we're gossiping out all of our past transactions, what we're doing is building a history of the hashgraph that any node or any computer on the network can verify. And we go through a process called virtual voting, which is the chief innovation of Dr. Leemon Baird, who is one of the founders of Hashgraph and Hedera. This gossip about gossip with virtual voting enables us to have an asynchronous Byzantine fault tolerant consensus mechanism, which allows us to be very fast, because our computers don't have to send extra receipts or votes as part of the consensus mechanism for any type of transaction. Based on the history of the hashgraph and how we talk to each other, they can actually come to consensus asynchronously. And I can share a YouTube video in the chat that goes into a little bit more detail, in a four-minute version or a 50-minute version, if folks are interested. But effectively, what we get out of this is a very high-throughput distributed ledger. It can do tens of thousands of transactions a second today in its public implementation on Hedera. It is asynchronous Byzantine fault tolerant, which is the highest form of security in distributed ledgers. And beyond that, it's also fair, and this is really important for ESG and sustainability-focused use cases, because when we're taking in tens of thousands of real-world events a second, we have fair access to the ledger. You can submit to any node on the network. You have a fair timestamp based on the median timestamp of a consensus message.
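The gossip-about-gossip structure described above can be sketched very loosely: each event carries the hash of its creator's previous event and of the last event received from a peer, so the whole history is verifiable, and the fair consensus timestamp is the median of the times at which nodes received the event. This is a toy illustration of the data structure, not Hedera's actual implementation:

```python
import hashlib
import json
import statistics

def h(obj) -> str:
    """Deterministic hash of an event (illustrative)."""
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()

def make_event(creator, payload, self_parent_hash, other_parent_hash):
    """A hashgraph-style event: a payload plus the hash of the creator's own
    last event and the hash of the last event received from a gossiping peer."""
    return {"creator": creator, "payload": payload,
            "self_parent": self_parent_hash, "other_parent": other_parent_hash}

def consensus_timestamp(node_receive_times):
    """Fair timestamp: the median of the times nodes received the event."""
    return statistics.median(node_receive_times)

a1 = make_event("A", "tx1", None, None)
b1 = make_event("B", "tx2", None, h(a1))  # B gossiped with A and records A's event hash
print(consensus_timestamp([10.0, 11.0, 12.5]))  # 11.0
```

Because every event commits to two parent hashes, any node can rebuild and verify the full gossip history, which is what makes virtual voting possible without sending separate vote messages.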
You have fair order for every single transaction, as determined by the nodes, and you have a fair fee, which we'll talk about a little bit more. It's paid in HBAR but denominated in USD, and it's very predictable, a hundredth of a penny, or $0.0001 USD, so you can build cost-predictable businesses. That's a big advantage in the public ledger space, and one reason we've seen a lot of adoption by enterprises. In terms of the services that folks are building on top of Hedera, there are two key services used in this architecture. One is the consensus service, which you can think of as a public message queue, where all of the events, maybe a solar inverter saying, hey, I've created X amount of kilowatt-hours or X amount of megawatt-hours over a period of time, are recorded. You can even take your verifiable credentials and decentralized identifiers, following the W3C specifications, and use the consensus service to pass those messages through. We also have our token service, which acts as a public unit of account that we can link all of our consensus messages to. That token service could hold the things you account for, your carbon emission tokens in metric tons or even sub-metric tons, and the same could be said for carbon offsets, renewable energy credits, or any other type of token. Together they support the key application use cases for this sustainability ecosystem: we create a public audit trail from a data integrity perspective for things like the kilowatt-hours over time, or it could be a different step in a certification. Maybe it's the hash of a photo used for data collection at a farm, making sure that farmers planted a cover crop, for example.
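The consensus service described above can be loosely modeled as an append-only message log with a running hash, so any subscriber can replay the messages and verify their order and integrity. This is a toy sketch under that assumption, not the real HCS API or its exact running-hash algorithm:

```python
import hashlib

class Topic:
    """Toy append-only message queue with a running hash, loosely modeling
    how a consensus-service topic lets anyone verify message order."""
    def __init__(self, topic_id: str):
        self.topic_id = topic_id
        self.running_hash = hashlib.sha256(topic_id.encode()).digest()
        self.messages = []

    def submit(self, message: bytes) -> bytes:
        # Each new running hash commits to all prior messages and their order,
        # so tampering with any earlier message changes every later hash.
        self.running_hash = hashlib.sha256(self.running_hash + message).digest()
        self.messages.append(message)
        return self.running_hash

topic = Topic("0.0.5005")
topic.submit(b'{"inverter":"did:example:inv1","kwh":4.2}')
topic.submit(b'{"inverter":"did:example:inv1","kwh":3.9}')
print(len(topic.messages))  # 2
```

A verifier who replays the same messages in the same order against the same topic ID arrives at the same running hash; any reordering or edit produces a mismatch.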
When we add all of these data integrity events together, though, we still need a public unit of account based on certifying that these actions are true, and that's where our token service comes in. We can measure those tokens, and decentralized identifiers created through our consensus service, following the W3C spec, describe the roles of the individual parties, their acts, and the certifications that they've given to other things like devices, in the case of an inverter, or to an installer of an asset, if it's from, say, a root authority that says, hey, you're okay to go ahead and install this inverter, you're a certified party. That same process can be applied on the carbon emission token side, creating the workflow to certify that an emission is, in fact, the right calculated amount. And when folks mint emission tokens, they can actually go buy the offsets in the ecosystem and match unique emissions from a single factory to offsets created in the ecosystem in a fully auditable way. It goes down to a specific device and that device's data events, so we can prove timestamp to timestamp at, say, 15-minute granularity if you wanted to, or even finer, in the case of a renewable energy credit offsetting in real time. That's all buildable on Hedera and auditable for all the different end users of the public ledger. So that's Hedera in a very quick nutshell. We could go on for hours on this, but I wanted to give everyone a quick background. And then I also want to talk about the governance of Hedera, because Hedera is not just the technology itself, it's the strong governance and the different actors, represented across geographies and industries, who are taking this technology that they believe is critical for their organization and finding new use cases to change how their business works.
And so we have all sorts of different folks. For example, DLA Piper has created a security token offering platform, which we could talk about another time, or we could go into some of the work that Avery Dennison is doing in logging supply chain actions through their atma.io platform. They're even focused today on doing things around carbon emissions accounting for scope three type events. So they're looking at Hedera not just from a builder perspective, but also making sure it meets their enterprise requirements as they help govern the network: making sure the technical roadmap helps them meet feature requests, supporting things like our open source initiatives such as the Guardian, making sure that we think about industry standards and different privacy regulations from a legal perspective, and also helping to make sure everything goes smoothly, the fees work right, and the treasury is deployed properly. One point I always like to make in these types of conversations: a lot of times people talk about the footprints of public networks, and they're concerned, rightly so, about the outsized footprint public networks can create. Hedera, because of the efficiency of the Hashgraph consensus algorithm, is an extremely low footprint DLT, including from a per-transaction perspective. UCL, one of our governing council members, came out with a report comparing a number of public proof-of-stake networks to each other, including Eth 2.0 as it switches to proof of stake, and they showed that Hedera had the lowest energy per transaction among those public proof-of-stake networks. I do like to point this out, because some folks say, hey, I'm worried about building on a public ledger. This is also a lower carbon footprint per transaction than Visa.
And so now to jump into: well, how are we solving this from an implementation perspective? What's in the open source repo that's coming out? How can I use this, and what should I be aware of? Hedera is working on industry standards, like I mentioned with the IWA, to describe all these assets, and the workflows that we're creating take into account different modular benefit projects. You could think about a farm having a series of solar panels that you want to account for using one scientific methodology, maybe soil organic carbon capture following a different methodology, and a water rights token tied to yet another methodology, for all types of different ecological benefits. These can be tied to a single ecological project with multiple modular benefits, each with a different MRV protocol that we would be following. Each methodology would have a claim, and you could add all sorts of ancillary data that may be valuable, whether it's additionality, durability, leakage, or any other data point that you think is relevant. There's flexibility to add this into the claim and get the right attestation following that scientific methodology, following these standards. On Hedera, this is exciting to me, and it's one of the big reasons why we focused on this area: we're excited about bringing in all these real-world events in real time, with the fair timestamp, with finality in three to five seconds, and paying a predictable fee, whether it's a tenth of a penny to go ahead and mint a token, or a hundredth of a penny, $0.0001 USD, for a consensus service transaction to report any type of data in a token workflow. That workflow looks a little bit like this.
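The modular benefit structure just described, one ecological project carrying several benefits, each following its own methodology and accumulating its own claims, might be modeled like this. The class and field names are illustrative assumptions, not the Guardian's actual schema:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ModularBenefit:
    """One ecological benefit within a project, tied to its own MRV methodology."""
    benefit_type: str          # e.g. "solar-generation", "soil-organic-carbon"
    methodology: str           # identifier of the MRV protocol this benefit follows
    claims: List[Dict] = field(default_factory=list)

@dataclass
class EcologicalProject:
    """A single real-world project that can carry multiple modular benefits."""
    project_id: str
    benefits: List[ModularBenefit] = field(default_factory=list)

farm = EcologicalProject("farm-001", [
    ModularBenefit("solar-generation", "rec-methodology-v2"),
    ModularBenefit("soil-organic-carbon", "soc-methodology-v1"),
])
# A claim can carry ancillary data such as additionality or leakage estimates.
farm.benefits[1].claims.append(
    {"tonnes": 12.5, "additionality": True, "leakage_pct": 0.02})
print([b.methodology for b in farm.benefits])
```

Keeping benefits modular means a single farm can mint RECs under one methodology and soil carbon offsets under another without mixing their audit trails.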
We have a sensor, and there's a certifying action that goes into saying the sensor can emit the data. The sensor can emit any type of data point, such as a kilowatt-hour, which is observed by our Guardian, or decentralized validator, which goes ahead and mints a claim or a token depending, again, on the scientific methodology and what it calls for. That claim or token has a link to all the different events that went into the certification of that sensor, as well as the sensor data itself. Ultimately, when you go to retire that asset, or even look at it on the market, you can see every specific action that's been attested to and the entire data audit trail behind it. This can get really complex, with 10-plus or 100-plus parties, but for the sake of simplicity, here's what it really looks like. You could have a root authority, let's imagine it's Verra today, and they could issue a verifiable credential that describes some type of actor. It could be a validation and verification body, or really any type of certifying authority, that can then issue another credential to, say, a device. That could be anything from a satellite, with satellite imagery as one data source, to a device like an inverter in the case of a REC. Or it could be to another party that's one of their partners. And we know exactly who is certifying whom, their role, how they're describing this actor, and what that actor is certified to do. Ultimately, we end up with a data event that is fully traceable, based on the certification of what they're allowed to do from a description perspective, that can be audited and checked before the minting of a token. Meaning that we create a system of checks and pre-checks to ensure that the supply of a carbon offset is really protected before it's created.
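The credential chain just described, a root authority certifying a validation and verification body, which in turn certifies a device, can be sketched with stand-in HMAC signatures in place of real DID-controlled key pairs. A Guardian-style validator would walk this chain back to the root before allowing a mint. Everything here is a simplified assumption, not the actual W3C verifiable credential wire format:

```python
import hashlib
import hmac
import json

def sign(key: bytes, claim: dict) -> str:
    """Stand-in for a DID-controlled signature over a credential."""
    msg = json.dumps(claim, sort_keys=True).encode()
    return hmac.new(key, msg, hashlib.sha256).hexdigest()

def verify(key: bytes, claim: dict, sig: str) -> bool:
    return hmac.compare_digest(sign(key, claim), sig)

# Toy keys standing in for the root authority's and VVB's key pairs.
ROOT_KEY, VVB_KEY = b"root-authority-key", b"vvb-key"

vvb_cred = {"subject": "did:example:vvb", "role": "validator"}
vvb_sig = sign(ROOT_KEY, vvb_cred)            # root authority certifies the VVB

device_cred = {"subject": "did:example:inverter1", "role": "data-source"}
device_sig = sign(VVB_KEY, device_cred)       # the VVB certifies the device

def chain_ok() -> bool:
    """Walk the trust chain back to the root before allowing a mint."""
    return (verify(ROOT_KEY, vvb_cred, vvb_sig)
            and verify(VVB_KEY, device_cred, device_sig))

print(chain_ok())  # True
```

If any link is forged, say the device credential claims a role its issuer never signed, verification fails and the mint is rejected.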
That can get extremely complex if you're bringing in three, four, five, or maybe a hundred different sources of data for complex measurements, or projects across a series of sensors where we even add extra steps around certifying that each sensor is operating correctly. Maybe you have a piece of firmware that's technically audited to make sure it's behaving properly. You could have different root authorities that say, I've certified this sensor company to make these 15 sensors, and check to see if a project is bringing in the right data as part of a workflow. You can bring in manual data or sensor data as part of this, or even third-party data using oracles or third-party systems, all of which are ultimately checked by the Guardian from a data collection perspective and checked to make sure they're following an auditable framework that's been approved from a root authority perspective, and to make sure that if you're following a methodology, that methodology is still intact and doesn't have any questions about it. That leads to the minting of a token that can be tracked and traced throughout the entire market, and allows any buyer to see the inner workings of the history of that token and call into question any data or make decisions based on that information. From a Hedera perspective, it looks a little bit simpler than that: we're taking a log of data from, in this case, a solar panel that has its own identifier and is emitting data. That's linked to its entire history as part of the trust chain we talked about, where you have a certifier who said, hey, this is how this inverter is supposed to operate, and you can even prove that their certification is still good from the root authority every time a data event is minted.
So if you say, hey, I'm auditing you, and the audit's good for three months, we're going to have the Guardian validating that every time a data event is created, before a token is minted, which is then ultimately purchasable through the public network. What that leads to is a fully auditable and discoverable token, putting those flags out there so we know that, whether it's a REC or a soil-based offset, the token is what it says it is, with all the checks behind it. Before I get into some examples, does anyone have any questions?

Yes, Wes, I have a question regarding the root of trust. How did you implement this sort of chaining of the credentials? Do you put all the credentials of the issuers into the next credential, so that the root of trust can be verified just by taking a look at the last credential? Or do you use registries for this? Or is it a more dynamic approach where you just have a DID, you can resolve the DID and ask for some information to make sure this was the issuer of this verifiable credential, and then you go to the next issuer in the chain? How did you implement this?

So effectively, you can think of the Guardian as managing, for its methodology, all the credentials, basically a data registry, if you will, for the credentials within the methodology for the root authority. The root authority implements it, saying, hey, you need these five rules, ten rules, whatever it may be, to go ahead and mint a token. What the Guardian ultimately is, is the policy engine to configure the different decentralized identifiers and verifiable credentials that are required to create a token following that methodology. And so you're able to look up exactly who's represented by the DID. You're able to understand, okay, in the case of a data generator, we used the solar panel before, so I'll stick with that.
For that solar panel, it was given a credential on a certain date, and as long as that signature is not revoked and has not timed out, it's still good. We can check the same for the different installers, and we can go ahead and look it up and, based on the rules, see and query that on a public ledger. So it is built in, but the schemas that you create are very flexible within the policy engine for the Guardian. That's probably the short answer; there's a much longer answer that I'm happy to take offline.

I have a question, if you don't mind. Sorry, I heard some background noise there. I have a couple of questions, actually. The first one: if I owned a project that was creating carbon offsets and they were calculated in real time, let's say from burning methane gas, is there a way for me to create an oracle that puts those out to market and then has a strike price based on what the market price might be for the carbon offsets through the Hedera market at the given time?

So Hedera is not actually creating the market itself, but there are plenty of folks who are building markets on top of Hedera. You could absolutely build an application or an oracle that interacts with some of the markets on Hedera that does that. And if you wanted to build policy-driven actions, I would suggest looking at our updated smart contract service, which we actually just announced and which interacts with HTS tokens. That updated service will allow you to do a lot of the programmatic actions I think you described here to interact with those markets. So it's a great time to be building, because that smart contract service, which is based on the Hyperledger Besu EVM, would enable you to do that, leveraging tools that I think a lot of folks in the community would be familiar with.

Perfect.
And my second question was: is there a way to utilize the decentralized methods here to create any sort of decentralized governance, if I were to try to build a DAO, for example?

Yeah. So the Guardian is really flexible in terms of what you could create from a governance perspective. A methodology really could allow you to control anything from a minting event for a token to even a retirement event. So you could say, hey, I want to ensure that every minted token we create as part of our decentralized registry or DAO is only allowed to be burned in the case that it's matched with a carbon emission token from these 15 different sources, for example, if you want to make it that specific. Or you could make it a little more generic, depending on what level of complexity you wanted to add. In terms of the governance rules, though, it's really pretty flexible. The Guardian is built in JavaScript, so a pretty well-known language from a programming perspective. And the management of the token is really up to the implementer, whether they want to make it prescriptive so that just a single Guardian exists, or interconnect a network of Guardians; that is all up to the implementer.

That's perfect. Thank you so much. Absolutely. Any other questions?

If it's okay, yes, sorry, I had a question, if you can hear me, about data privacy or data management. So I understand you have a network of Guardians, and they put a policy in place for creating identifiers for different actors in the system, and a policy in place for how data is collected, say for an MRV process or from an actual facility. But a lot of the data that's collected, I mean, the end result is to have some sort of public data token registered in the network, but there are a lot of perhaps private, IP-sensitive data feeds.
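A retirement rule like the one just described, where an offset may only be burned when matched with an emission token from an approved list of sources, might look like this in outline. The field names and the shape of the policy check are hypothetical, not the Guardian's actual policy syntax:

```python
# Hypothetical allow-list of 15 approved emission sources, as in the example above.
APPROVED_EMISSION_SOURCES = {f"did:example:source{i}" for i in range(15)}

def can_retire(offset_token: dict, emission_token: dict) -> bool:
    """A Guardian-style retirement rule: an offset may only be burned when
    matched against an emission token from an approved source, and the
    emission being netted out cannot exceed the offset's tonnage."""
    return (emission_token["source"] in APPROVED_EMISSION_SOURCES
            and emission_token["tonnes"] <= offset_token["tonnes"])

offset = {"token_id": "off-1", "tonnes": 10.0}
emission = {"token_id": "em-1", "tonnes": 8.0, "source": "did:example:source3"}
print(can_retire(offset, emission))  # True
```

In a real policy engine this check would run inside the workflow before the burn transaction is submitted, so a non-compliant retirement never reaches the ledger.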
Is there a way for the system, the hashgraph, to manage these public versus private data feeds in a way that respects industrial IP or other legal issues? Absolutely. That's a great question. So the way that the Hedera consensus service works, and really Hedera in general, is we allow you to bring your own encryption to the network, and we allow topics to be either encrypted or unencrypted. You can almost think of the Hedera consensus service as a decentralized message queue, compared to something like Kafka being a centralized message queue. And if you want to say this topic is permissioned, only certain folks that are approved can submit to it. That's one way to handle things. And you can either encrypt that or not: you can encrypt part of the message or encrypt the whole message. You can hash part of the data or make some of the data plain text. And so there's a lot of customization that you can do for your specific architecture. I know on the carbon emission token side, you generally want to expose certain levels of granularity of information. Say latitude and longitude may be too precise for some corporation's comfort, but maybe you want to at least keep it at the zip code level. Those are all different choices that you can bring. It's part of making sure you're following, of course, the methodology's prescriptive actions for accounting, but you can bring those different flexibilities to the table from a Hedera consensus service perspective. Thanks. Any other questions before I jump into a few examples? Great. So I'm going to talk about a few real-world applications. I'm going to try to keep it to one on the demand side of the market, the carbon emission side, one on the supply side in terms of carbon offsets, and then talk about how some of the mechanisms of the market may work in the future.
And so on the demand side is where those carbon emission tokens exist, and there's actually a specification that the IWA created a little while back on what a carbon emission token really is, which you can go ahead and stand up with a Guardian. In the case of Reveley Digital, they've been working with our open source implementer, Envision Blockchain, who did some great work here in getting this stood up as a piece of community-driven open source code. And they've been working with Reveley on accounting for emissions tied to greenhouse gas measurement for natural gas fields. A lot of folks have some concerns about the natural gas industry, and what Reveley is doing is taking a leadership stance. That leadership stance is providing granular accounting tied to different things like IoT devices, where they're able to report the MRV-based data, not so different from the inverter and IoT sensor example that I gave earlier. What they're able to do is take that emission event and bring it back to specific IoT sensors or specific sites within their accounting. And what this ultimately leads to is that transparent, role- and actor-based system where they not only know the auditable framework that they're following, they know every single piece of auditable data down to every single sensor, which has a decentralized identifier and a verifiable credential describing it, tying all the way back to that root authority from a certification perspective. What they see is that this is the next level of transparency. And then when they go to buy different types of offsets and credits, they're actually able to prove every single location and the type of credit they're buying against the emissions that they're creating, to lead to that complete granular view of how they're netting out in their offsetting process. Now, I think this is a really good example, but on the supply side, they could be buying from folks like DOVU.
DOVU is a UK-based company, and they're working with different farmers across the UK on soil organic carbon sequestration. This use case is really a much more manual use case, and it shows the flexibility of the Guardian, where they're working with farmers who are taking additional actions as part of their farms to capture carbon, not only teaching the farmers how to do so, but actually proving that they're doing so in the process. And that additional calculation is tied back to a specific consensus service topic, or topic ID, where we can store all that documentation and publicly view it. Ultimately that leads to the creation of what they're calling a dynamic NFT, which is that unique metric ton where they can view the granular data down to even the sub-acre level. And so the topic ID is basically that view where all the data that they can see and make discoverable tied to that token lives, and every token that they produce will be visible on ledger, tied to that specific set of accounting metrics. And so those are just a couple of examples, but really what you get to is that complex workflow that we talked about earlier. We have the entire framework and every piece of data, auditable, using the consensus service, and made discoverable tied to a topic that a company like DOVU can implement and you can go see. And those topics are easily viewable and they're tied to the token. And that token is down to the metric ton, and some folks, especially in the energy trading space, are looking at even less than that: not just the megawatt hour, but going into the kilowatt hours. And it's unique. It has all sorts of different attributes that you can add onto it.
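The per-ton token with added attributes, and the attribute-based filtering it enables, might be sketched like this. A minimal illustration; the record shape, IDs, and field names are assumptions, not an on-ledger format.

```javascript
// Illustrative sketch: each tokenized metric ton carries metadata
// (additionality, durability, vintage, geography, ...) plus a topic ID
// pointing at its auditable documentation. Record shape is assumed.
const offsets = [
  { tokenId: "0.0.1001", topicId: "0.0.2001", vintage: 2021, geography: "UK", durability: "high" },
  { tokenId: "0.0.1002", topicId: "0.0.2002", vintage: 2019, geography: "US", durability: "low" },
  { tokenId: "0.0.1003", topicId: "0.0.2003", vintage: 2021, geography: "UK", durability: "low" },
];

// Keep only the offsets whose metadata matches every requested filter,
// the way a buyer might filter listings by search attributes.
function searchOffsets(records, filters) {
  return records.filter((r) =>
    Object.entries(filters).every(([key, value]) => r[key] === value)
  );
}

searchOffsets(offsets, { geography: "UK", vintage: 2021 }); // two matches
```

In practice the records would be read from the public ledger and the topic ID would let a buyer drill into the underlying documentation for any match.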
And what that leads to is a really interesting market interaction, where because we're able to account down to the specific metric ton or sub-metric ton, and with this additional metadata around things like additionality, durability, leakage, vintage, geography, and so on, we're able to see the carbon markets in a much more granular way. We can almost think about it like how we search for houses today on sites like Zillow or realtor.com. If you were looking for a studio in a specific area, let's say New York City, and you wanted to compare that to a one bedroom versus something else, you would have your different search attributes and you'd see a list of 100 or 1,000 or 100,000 places, whatever it may be. In this case, we're going to be able to do the same thing in the future for carbon offsets, because we're going to be able to publicly view all of these assets on Hedera and have the full auditability tied to this framework, and the full view of every single attribute that we care about. And frankly, we can let the market decide the price from a metadata perspective and reward those that bring the right pieces of metadata that the market cares about, whether it's the additionality, durability, leakage, or something else that we're not thinking about or that maybe is only in a few methodologies today. And we're able to bring that transparency in a way where we're bringing the market together to get to that singular carbon price we talked about at the beginning, with clear accounting methodologies at both the framework and the data level, trying to accomplish that big goal of a singular carbon price and building new types of market mechanisms. And so that's really where I wanted to end up today. I'm happy to open it up for any questions, and I figured I'd share my contact info here. This is Billy Welch with Tracer DAO. We're building derivatives markets for a variety of assets, including environmental contracts.
Question: we're building on Arbitrum right now. With Chainlink, would we be able to take the data that exists on the ledger to build interesting derivative products, whether it's a single price of carbon or more localized carbon prices? So I would say you could absolutely take the data and build different oracle products around the assets that are on Hedera. It's totally technically feasible. What I would say is, because we're tying the specific audit trail to the token on Hedera, there may be some interesting markets that can be built on Hedera in a high-throughput way that would be completely on ledger, but it's really up to the builder at that point what you could put together. I would say it's a pretty limitless possibility. You can do tens of thousands of transactions a second. There are some nice unique features to Hedera where you can do many-to-many token transactions: 100 tokens for, say, X amount of fungible tokens, for a stablecoin or HBAR, whatever it may be. So there are a lot of different markets that can be built on Hedera, especially as we're adding the programmability with smart contracts 2.0 and the scalable EVM. Hey, would you be willing to take this offline to just discuss further? Yeah, absolutely. Please feel free to ping me. A good way to get a hold of me is Discord; we just stood up a sustainability channel, and hedera.com has the link. And for those who are thinking about building, I do also want to put a plug out there for the HBAR Foundation. The HBAR Foundation is a Hedera ecosystem development org. Hedera recently announced, and I don't know where it stands today, but at the time it was a five billion USD ecosystem development fund, in HBAR of course. And that fund is really focused on driving activity as part of the Hedera ecosystem and working with high-promise projects that can help change the world.
And working on use cases like this, or any other use case that could be out there, whether it's an oracle-based use case tied to data, bringing the next million offset projects onto Hedera, or whatever it may be. Oh, very interesting. Thanks. Well, let's see if there are a few other questions. Anyone want to chime in and ask anything else? Wes, I had one more question for you regarding the Verra root structure. If I was pulling credits, I'm guessing that the projects will still be registering with Verra as the validator. So then if I were to be retiring credits and using Hedera, could I use the Verra hash that is issued upon retirement to be able to validate that those credits are effectively burned? Yeah. So we have a lot of folks bringing in, I'll call it, different methodologies from traditional nonprofit registries. I don't want to get too specific here or overstep or say that we are working with Verra or any other registry. There are a lot of folks that are in the project space, and it's really up to their architecture. There's a lot of flexibility in the Guardian to take those methodologies and digitize them, but it's ultimately up to them to figure out what works with the different registries, because I know that's a pretty big area of focus for a lot of projects. Got it. Thank you. Hey, Wes. My name is Gino. I co-founded a startup up here in Alberta, Canada with Neeraj, who's here on this call, and we're unlocking emissions trading for micro-generators. Right now we're focused on the compliance markets as opposed to voluntary markets and voluntary registries, because we're finding that the prices in regulated emissions trading markets are consistently higher and more, I guess, backed by recognized or trusted sources than most voluntary registries. What do Hedera's collaborations with any regulated emissions trading schemes or markets look like?
So yeah, we work with a lot of different projects across the voluntary space, and we're also working with some of the compliance space as well. It really depends on which regulator different folks fall under, because there are different schemes in every single country for the most part, or region in the case of, say, the EU. We're pretty flexible. This is a piece of open source infrastructure that we see folks using in both spaces. We actually do see a couple of projects trying to span both voluntary and compliance markets, where they're producing what they call compliance-grade credits. This is popular particularly in the REC space: compliance-grade credits that can sell in both voluntary and compliance markets. Are there any specific compliance markets you'd like to mention? Nothing that I can mention right at this moment. Thanks. Appreciate it. Any other questions? If nothing else, I'm more than happy to connect offline. Please feel free to reach out to me. If you don't have a Discord account, I am out there on LinkedIn, just as Wes Geisenberger, and also out there on Twitter. And yeah, I'm more than happy to continue the conversation if you want to learn more about the HBAR Foundation or Hedera. I'm always happy to have that discussion, and thank you all for having me. I really appreciate it. Absolutely. Thank you so much, Wes. It's been a really, really interesting conversation. I also want to flag that at our next meeting, on November 2nd, we have another great presenter from Moja Global, presenting on open source MRV software for forestry, agriculture, and land use. So please join us then. And if there aren't any more questions, I guess we'll adjourn the call. Actually, I see one from Elizabeth Green: what proof of reliability, authority? Elizabeth, do you want to ask that question yourself? I'm not sure I understand exactly what the question is.
So what proof of authority, reliability, and stake do the 39 members of the DAO maintain? And how do you ascertain that they're maintaining their proof of stake, et cetera? So Hedera's governance council, which I think is where the question's going, is made up of large, Fortune 500-type actors and large, highly reputable universities, spread across industries and geographies. They each run a node of the network, which runs the hashgraph consensus algorithm. And you can see all those companies; I'm going to put the link here in the chat: hedera.com/council. You can check every single node, obviously, to make sure it's running. We actually have docs.hedera.com, where you could even look at the node spec yourself to learn a little bit more about it, if you want to get into some more technical depth. But what the council members are doing is published transparently. And I say we, as I was part of the team very recently, but the Hedera council really publishes all their governance aspects in a transparent way: meeting minutes and everything else that goes with that. From a technical perspective, the code's open source; you can actually go look at it, it's out there on GitHub. And when it comes to reliability, it's a public network, so we have status.hedera.com, where you can see the entire history of the reliability of the network itself. In terms of authority, each of the companies has their own node and votes, with different interesting signing tools, which Hedera has talked about publicly in blogs. I probably don't want to get into the details of that today for this webinar, but if you follow up offline, I'm happy to share the blog post that describes that. And then Hedera is a proof of stake network. Today, from a stake perspective, the stake is weighted equally.
And in the future, and we have a video series called the Path to Decentralization, which is out there on the Hedera YouTube, that actually describes how that stake works and how it'll change over time. There's another question. Wes, from Kyle Robinson: how much of this ecosystem has been built and implemented in production to date? I don't know if you need to be more specific; I feel like you described several different ecosystems that are running on the platform. But yeah, we have a bunch of different projects. Some of them will become more public over the coming weeks, and we'll have a few things out there even later today that are more public, in addition to what we shared with DOVU and Reveley Digital. I have a Hedera Sustainability Ecosystem GitHub page for folks who want to learn more about what's going on in the ecosystem specifically, and I'm going to just drop that in the chat here to help answer that. One of the things that I'm really looking forward to is seeing the explorers that are built specifically around these types of discoverable assets. I think that'll really show it, and it's one of the things I'm hoping people apply to the HBAR Foundation with in the future: new ideas of how to demonstrate this. Because I think that as we mature across the industry, showing all these assets in one place will be a really good opportunity for someone to take advantage of. But today, it's out there on the public ledger and you can see it; even on the testnet, I think there were multiple RECs created today that I saw just watching the streams go through. Hey, I have a question as well. Thanks, Wes. Really interesting. I'm reading that there is some partnership work with IBM Blockchain. How does Hedera think about interoperability with traditional blockchains? Yeah, so Hedera is very, very active in this space. And I'm going to try to grab this quickly on the fly.
There's an open source decentralized bridge architecture out there on GitHub that some of the folks in our ecosystem have built. I think it's pretty interesting, and I know there are some folks implementing that today, so I would keep an eye out for activity around this in the future. But in terms of interoperability, Fabric is a good example: there's actually a Hedera consensus service plugin. So if you ever think about tying something like Fabric, or any other Hyperledger framework that supports a consensus plugin, to a public network, then you can plug in the consensus service as that consensus algorithm to get further public auditability for the ordering of Hyperledger networks or Hyperledger-based systems. And if you have really sensitive data and you're just concerned, maybe you're taking a carbon emission and you aren't quite ready yet for a public ledger, but you want that fairness and high-throughput ordering from the Hedera network, you could plug in the HCS plugin that is out on the Hashgraph GitHub site, under the same area as the Guardian itself, and you could go plug that into Fabric. Now, what you get out of that is you're actually leveraging the same global timestamp, so you get that fair order across multiple networks. So if you have a permissioned framework like Fabric, you can reference the timestamp, specifically tied to Hedera, for a transaction happening in that private framework in relation to what's going on in the public network, say with a carbon offset, as an example. From an implementation perspective, I do think it's very valuable for folks to be out there on a public network showing every step of the way, to give that enhanced transparency. But for folks who aren't quite there yet, one option is to relate those interactions together using the same global timestamp. Thank you. Okay. Well, I don't see any additional questions, so last chance if anybody has a final question for Wes. There we go.
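The idea of relating a transaction on a private, permissioned framework to events on the public network through the shared global timestamp can be sketched as a simple merge of two event streams into one fair order. This is an illustrative sketch under assumed event shapes, not the HCS plugin's actual interface.

```javascript
// Illustrative sketch: events from a private framework and a public
// network each carry a consensus timestamp from the same ordering
// service, so the two streams can be merged into a single fair order.
function mergeByConsensusTimestamp(privateEvents, publicEvents) {
  return [...privateEvents, ...publicEvents].sort(
    (a, b) => a.consensusTimestamp - b.consensusTimestamp
  );
}

const privateEvents = [
  { network: "fabric", consensusTimestamp: 105, detail: "emission recorded" },
];
const publicEvents = [
  { network: "hedera", consensusTimestamp: 100, detail: "offset minted" },
  { network: "hedera", consensusTimestamp: 110, detail: "offset retired" },
];
mergeByConsensusTimestamp(privateEvents, publicEvents);
// Ordered: offset minted, emission recorded, offset retired.
```

The point of the shared timestamp is exactly this cross-network comparability: the private framework's records can be placed in a verifiable order relative to public-ledger events.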
How many node operators can the system accommodate? I'm not sure if that's directed at Hedera or at the Guardian itself, so I'm going to try to address both and hopefully that'll help. From a Hedera perspective, Hedera is currently at 39 governing council seats. However, we've talked about opening that up to community nodes, and I think there's a lot of public conversation around that over time, and ultimately becoming permissionless; there's an entire webinar series out there on the path to permissionless. There's no technical limit on the number of nodes that we're looking into. So that's the Hedera level on the mainnet. From a Guardian, or Guardian network, perspective, that's really up to the implementer. Right now, we see a lot of folks just doing a single Guardian, because they want to get this set up for their specific projects, and adding into that process the different parties who want to run it. At the same time, we know that, especially in the ESG space, there are a lot of folks who aren't comfortable running their own nodes, whether it's a corporation or somebody else. And that gives the flexibility for, say, that registry or root-authority-type body, or even a separate body who implements the root authority's methodology, to stand that up and allow them to interact, still in a publicly auditable, publicly verifiable, and discoverable way, with the Guardian network or the single Guardian that's being created. Is this unlimited? In theory, yeah; no one's tested unlimited, but I'm sure you could do as many as you'd like to do. It's just working through the scaling of that, based on, of course, the complexity of the rules that you create for the MRV. One unique technical feature of Hedera is that we can create multi-signature tokens, and you can have multi-signature accounts.
And so when you create a signing event in Hedera, you could embed private keys into every Guardian and prove the signatures, and you could do things that way. That's one of the cool features of the Hedera network. We didn't really get to discuss it too much today, but yeah, there are a lot of different features that are a little bit different on Hedera, maybe buried under the technical covers, but you can absolutely use them. Thank you. Absolutely. Okay. Well, it looks like we're running up to the top of the hour, Wes. Again, thank you so much for this great presentation. You've given us all a lot to think about. We really enjoyed it. So thank you. Absolutely. Thank you so much for having me. I really appreciate the time, and it was great to talk to you all. Please feel free to reach out and ask any questions. I'm happy to follow up there. Absolutely. Thank you. Have a good rest of the day.