Okay, share. Okay, and you guys, oh, is that the full screen? I don't want to do that. You said my desktop behind it. A little bit of your desktop, yeah. I'll just have to keep it, okay, whatever. It's fine. Well, good morning, everybody. Welcome to the August 22nd meeting of the Climate Action and Accounting Special Interest Group. Today, I'm excited that we've got a couple really great presenters, Heather Akinhosen. Heather, can you help me with the pronunciation of your last name? I don't want to get that wrong. You nailed it, Akinhosen. Yeah, you get gold stars. All right, thank you. And then Truman Semans, who I think will be joining us shortly, but we're gonna hear from them on updates from OS-Climate and all the work that they're doing. I'm really excited about this because you guys have just so many different programs going on and so much interesting work, especially related to the work that we do. I just heard something, somebody rung a doorbell, David. I don't know what that means. Oh, sorry, I can mute it. I think just when people join, it's giving a little buzzer. I'll turn that off. Okay. So for those of you who are joining for the first time, I also want to flag that Hyperledger follows an antitrust policy. Really important to note, we also have a Hyperledger code of conduct that we follow. But with that, I think I'll just kind of hand over, because I wanna make sure that we have plenty of time to hear from the folks at OS-Climate. Truman, it looks like you are in something from Middle Earth right now, like something elven, it looks beautiful where you are right now. But yeah, I'm in an old southern hotel. Okay, beautiful, beautiful, that makes sense. What part of the South are you in right now? I am on the Virginia, West Virginia border at a place called the Homestead. Oh, the Homestead. Yeah. Yeah, my parents live in Lynchburg, Virginia.
They just had their honeymoon, they're not, that's not their honeymoon, they're celebrating their anniversary there. It's beautiful there. Oh, terrific. Yeah. Okay, great. Well, I'll kind of hand it over. I'm really excited to hear from you guys. I wanna leave enough time for questions at the end. And so Heather or Truman, would you like me to share the screen or anything like that? Just let me know or just ignore it. So Heather is the main attraction here for all kinds of reasons today. I'm just gonna give a brief introduction without slides. I don't know, Heather, if you wanna pull up a cover slide now or wait until you start. Yeah, I think I need Sherwood to stop sharing. I'm getting an error. Awesome, thanks. Terrific. So just to kick off, thanks so much to David and Sherwood. We really appreciate the invitation. And this is a really, really exciting and important part of the Linux Foundation community. We're really indebted to participants, including Tom Baumann and others who really from the very early days of OS-Climate have been supportive of what we're doing. And so really pleased to be able to give you guys an update here, and for those on the call and those that watch the recording, one of the things that we're eager to do through this update to the SIG is to identify opportunities for collaboration, to welcome contributors to our current projects, and to welcome also potentially those that wanna host projects that seem like a good fit with OS-Climate as opposed to other parts of the Linux Foundation community.
So I guess just the high level framing before handing it over to Heather to talk about our programs is that we started OS-Climate five years ago to try to tap the power of open source, particularly to address the massive, multi-trillion, five to seven trillion dollar plus annual gap in investment flows needed to finance the transition of the global economy to be able to meet Paris goals, and not only from a transition perspective, but also from the standpoint of social welfare and protecting the natural environment from impacts. So Heather will talk through some aspects of this problem, but we started out narrowing down from sort of a broad focus of everything around data and analytics for climate-aligned finance, investing, business and policy-making to focus particularly on the needs and opportunities of the financial sector for alignment and for risk management. So we started out working particularly with large asset owners, pension funds and insurance companies on the asset side, and with banks, together with also technology companies and then service providers in the space like EY, to develop the projects that Heather's going to describe. So we've just made the next big step out in terms of the types of corporate members that we are bringing into the community. With Capgemini now, you would think they're an integrator like an Accenture, but one of the key things there is that the project that Heather's going to describe is all about helping large corporations to reach the levels of certainty in data and insights from analytics that they would need to make really, really large scale technology platform investments to move to a low carbon future. And so along with that, then, we'll be looking to bring in corporations, real economy corporations from the auto sector, hard to abate sectors, steel, oil and gas, energy, cement, et cetera, and others including all the way through ag and consumer goods and the like.
So let me just hand over then to Heather to tell you more about the particulars of the program and the projects. Great, thanks Truman. Yeah, just at the highest level, just to kind of piggyback off of what Truman said, what OS-Climate is trying to solve in terms of problems is making sure that, again, financial institutions, governments, nonprofits, real economy companies have the data they need to make decisions that are gonna move not only financial investments, but move the world to a low carbon or no carbon, net zero world. And the two problems that we've been focused on are really looking at making sure the data that goes into the decisions that need to be made is accurate, it's high quality, it's trustworthy, that people can have confidence in that data. So there's two main pieces to OS-Climate's platform. There's what we call our data mesh, which is this big box in the middle, which was architected by Red Hat. And the idea behind this data mesh is it's a federated system. So we're not copying all the data on the planet, but we are connecting to all the data on the planet, in theory, right? That's our longer term goal, connecting these different types of data sources and making them available for people to ingest into the analytical tools. Like I said, for assessing and understanding what's the risk that they have from a climate perspective and how best to transition. And are they getting their emissions down to a point where we all can have confidence that our world's gonna hold together, as opposed to some of the really frightening climate changes that we're seeing just this summer. But the federated data mesh is the underlying layer that uses, like I said, some of the latest technologies that are out there to make sure that we can do different things on the data to have, like I said, the trustworthiness.
So I'll go through those in more detail, but just wanted to kind of share this at a really high level: there's a data piece and then there's a tools piece, because there's lots of parts to OS-Climate. So just diving into the data commons, the data mesh. Like I mentioned, the core goal is to make sure that we start to manage data as code. And this is kind of an innovative architecture that Red Hat has brought to the program. So making sure that the data can be available to people at all levels so that you don't necessarily have to be a technical person. We're building something called the data exchange that allows people to search for the data, understand where the data is, be able to select data. And I'll share a few slides about what that looks like, but really use almost like a shopping cart model so that you can pull the dataset you need, maybe a model that you would like, and pull that all into a tool very seamlessly. The other piece, like I said, is kind of what we call the pipes. So you've got the exchange, which provides the access, but underlying that are the pipes that allow us to manage data as code and really try to improve the quality and the auditability, but also ensure there's transparency. So one of the key problems and challenges with making decisions today is you don't know where all the data came from or how it evolved over time or how data was derived from other datasets. And like I said, this data mesh architecture tracks every change and versions all the data. And you can see all the historical data lineage, if you will, of what happened, by whom, when, and all that type of thing. So that really drives the reliability and the ability to do comparisons more easily. And part of the data mesh is also having what we call open metadata. And that makes sure that there's consistency across the data sources and that you have information about the date and the time and the version and the source and all of that.
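To make the data-as-code idea concrete, here's a minimal, purely illustrative Python sketch of dataset versioning with lineage tracking. The class names and fields here are hypothetical, invented for this example; the actual platform builds these capabilities on tools like Pachyderm and Open Metadata rather than anything this simple.

```python
import hashlib
import json
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DataVersion:
    """One immutable, auditable version of a dataset."""
    version: int
    author: str
    timestamp: str
    content_hash: str   # fingerprint of the data itself
    derived_from: list  # lineage: hashes of the upstream versions used

@dataclass
class VersionedDataset:
    """Tracks every change to a dataset, like commits in a code repo."""
    name: str
    history: list = field(default_factory=list)

    def commit(self, rows, author, derived_from=()):
        # hash a canonical serialization so identical data hashes identically
        payload = json.dumps(rows, sort_keys=True).encode()
        version = DataVersion(
            version=len(self.history) + 1,
            author=author,
            timestamp=datetime.now(timezone.utc).isoformat(),
            content_hash=hashlib.sha256(payload).hexdigest(),
            derived_from=list(derived_from),
        )
        self.history.append(version)
        return version

    def lineage(self):
        """Who changed what, and when: the audit trail described above."""
        return [(v.version, v.author, v.timestamp) for v in self.history]

ds = VersionedDataset("corporate_emissions")
v1 = ds.commit([{"company": "ACME", "scope1_t": 1200}], author="ingest-bot")
v2 = ds.commit([{"company": "ACME", "scope1_t": 1150}], author="analyst",
               derived_from=[v1.content_hash])
```

The point of the sketch is only that every change is recorded with an author, a timestamp, a content fingerprint, and the versions it was derived from, which is what makes reproducing and auditing results possible.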
Like I said, our goal is to make sure that all that is available for our users. So a little bit more on some of the data mesh challenges. Like I said, the architecture and the components are really focused on addressing the pain points associated with managing not only large, complex climate datasets, but data in general. So this is an extensible platform. The data mesh is a really good use case for climate datasets, but it could be used in the medical areas. It could be used in different industries. It's not exclusive to climate data. But with the federation, like I'd mentioned earlier, we want to connect to data sources. We don't wanna copy data, because copying data tends to propagate this issue of versioning and integrity. And one of the key concepts that we focus on is that data is owned. And when you have data owners, that helps preserve the integrity. So when we pull data from the UNFCCC or a source like Riskthinking.AI, as those data sources are updated, the data mesh will ensure that those updates are automatically triggered and tracked for users. And like I mentioned earlier, data as code is this new concept that really allows us to solve some of the problems, where the data can be compared and is a lot more accurate and trustworthy. The other thing about our open source, and all of you know this, is open source really affords a level of transparency that's really critical, especially when there's a lot of, we'll call it, climate science deniers. Being able to have that open source capability, being able to reproduce the results, being able to look at how things have changed over time, drives that sense of confidence in the data that's really important.
And then obviously as an open source organization, the data mesh also affords collaboration and efficiency, where, like I said earlier, data scientists can really be able to share their results, they can work concurrently, and they'll be able to simplify their knowledge sharing and be able to replicate results, which really can help boost productivity for their work as well. And then for future-proofing and ease of maintenance, this notion of managing data as code and applying software-style pipelines to data pipelines also lowers the cost of ownership in terms of maintaining and updating and modifying as requirements change. So ultimately, just kind of in a one-liner, right? The data mesh is really all about creating a climate data product factory that's repeatable, where we have templates that our users can modify for their own purposes and leverage for their own data sets and for their own goals and use cases. I'm not gonna go into a lot of detail on this, but this kind of shares a little bit about this notion of managing the data ingestion pipeline as code. It talks a little bit about the tools and the technologies that we're using. Like I said, I won't go into a lot of detail; if you're interested, we can have a follow-up on those technologies, but like I said, there's some of the latest and greatest open source technologies, and all of it is open source, that we're using: Trino and Pachyderm and dbt and Open Metadata, to just name a few. All right. And just as kind of an example, Riskthinking.AI is one of our partners and they offered us well over 17,000 data sets, forward-looking climate data sets as well as economic data sets as well as some asset data sets, for free.
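The federation idea behind connecting to partner data, rather than copying it, can be sketched in a few lines of plain Python. This is purely illustrative and stdlib-only; the real platform does this through a Trino-based connector and the data mesh, and every name below is invented for the example.

```python
class FederatedCatalog:
    """Registers remote sources by reference instead of copying their data."""

    def __init__(self):
        self._sources = {}      # source name -> fetch function owned by the provider
        self._subscribers = {}  # source name -> callbacks to notify on updates

    def register(self, name, fetch_fn):
        # fetch_fn pulls the *current* data from the owner on demand
        self._sources[name] = fetch_fn
        self._subscribers[name] = []

    def subscribe(self, name, callback):
        self._subscribers[name].append(callback)

    def notify_updated(self, name):
        # the owner updated the data at the source: tell every subscriber
        for cb in self._subscribers[name]:
            cb(name)

    def query(self, name):
        # always reads through to the source, so results are never stale copies
        return self._sources[name]()

# Simulated remote dataset living on the provider's own infrastructure
remote = {"scenario": "drought", "version": 1}
catalog = FederatedCatalog()
catalog.register("partner_climate_data", lambda: dict(remote))

seen = []
catalog.subscribe("partner_climate_data", seen.append)
remote["version"] = 2           # the data owner updates at the source
catalog.notify_updated("partner_climate_data")
```

Because `query` reads through to the owner's data on every call, users automatically see version 2 without any copy being maintained, which is the versioning-and-integrity benefit described above.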
And again, with this data mesh and this federation capability, instead of copying all those data sets to our cloud infrastructure and incurring those costs of storage and things like that, we can connect to those data sets through our data mesh using a Trino-based connector, and all their data is on the Google Cloud. And it allows us to make sure that we've got the latest version, because as they make updates to that data at the source, users can be notified. And again, it's one example, but this applies to any data set that we would connect to. All right, moving on to the data exchange, and feel free to interrupt me if you have any questions, but the data exchange sits on top, if you will, of the data mesh, and it's our user interface that's really focused on making the data easy to find, consume, share and trust. And this kind of lays out a bit of a pipeline of our approach and how we decided to build this, but I think it's probably best just to hop to a use case. So in physical risk, which is one of the tools that I'll talk about here in a minute, one of the goals is for people to understand if their assets are at risk for a specific peril. It might be a drought or a flood or maybe a heat event. And the ability for someone who doesn't have technical expertise to be able to go into the data exchange is our goal, so that they can not only pull out the model that they might be interested in but make sure they have all the data sets that they require. So we've created kind of a shopping cart functionality that we're doing a proof of concept on right now, where they'll be able to pull in, hey, I'm interested in drought. I'm interested in this particular area of the world. Maybe it's a group of cities or maybe all their assets are in one city, and they can bring their own data.
Maybe it's a real estate portfolio that they have as part of their organization, and they can pull all those ingredients, if you will, from the kitchen and be able to perform the assessment using our physical risk and resilience tool. And again, the idea was they can get the data they need and use our tools, or conversely, they can download the data and the models for their own use in their own bespoke environments as well. On the data commons, the data mesh, there's also a couple of sub work streams that we have at OS-Climate that are also working together to, again, improve the data that we have as well as the quality of the data that we have. One is data extraction, which looks at sustainability reports and ESG reports from corporations. There was a contribution made by S&P Global of 75,000 corporate sustainability reports, and what we're doing with this tool, using machine learning and rules-based engines, is we take those PDF files, which are like unstructured reports, and go through them and try to extract key metrics like emissions data and their net-zero plans and their trajectories and things like that, so that that data set is available for users as well as can be funneled into our sector alignment and physical risk tools as needed. And then also we have what we call entity matching, and, again, entity matching really tries to make meaning out of some of the data. So a lot of the data sources that we get don't necessarily provide real specifics about company entities. You might just get a company name. You might not be able to link it to information maybe that's kept in the SEC or something like that. So entity matching allows us to start to match company names with legal entity identifiers, or LEIs, and it uses a number of sources like GLEIF and things like that so that we can start to map: who's the parent company?
Who's the subsidiary company and that type of thing, so that, especially when you're working with financial institutions, being able to basically know which companies you have data for and how that data is being used and how it rolls up under a parent is important to understand the overall picture of how they're moving towards their net zero emissions. So that's kind of all the data pieces in a nutshell. I'm gonna start to talk about each of the analytical tools. Any questions before we hop into the analytical tools that OS-Climate's building? Great, so let's start. I just wanted to perhaps state the obvious before Heather begins this next section. And that is, we are very eager for consultants working with financial institutions and real economy companies and policymakers that are in this SIG to hear what Heather's gonna present and has presented about the data, with an eye to potentially leveraging these tools in serving clients. One of our core theories of change is that the way that we'll have impact is not gonna be scaled by just the use by our members but really depends on the consultants large and small, whether it's EY, or BCG, who seem to be becoming a new member, or small consultancies. We really encourage that, and certainly the project teams are available to onboard consultants that want to understand and get familiar with these things. Thanks Heather, go ahead. Great, thanks Truman. Yeah, so now let's talk a little bit about the tools that we have. There's three primary ones that I'm gonna go over, and obviously if you're interested in any of these feel free to reach out, because I'm just gonna kind of touch on some of the key highlights about what they do and what the focus is. So the sector alignment tool really looks at implied temperature rise and helps investors align their portfolios with a 1.5 degree pathway. And again, the hope is that we can help drive emissions down to zero, and this is a way to assess where a company's at.
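Going back to the entity-matching step Heather described a moment ago, here is a toy sketch of fuzzy-matching raw company names to a reference table keyed by LEI. Everything here is hypothetical: the company names and the 20-character LEI strings are made up, and the real pipeline uses GLEIF data and more robust matching than stdlib string similarity.

```python
from difflib import SequenceMatcher

# Hypothetical reference table keyed by LEI (real data would come from GLEIF;
# these identifiers and names are invented for illustration)
LEI_REGISTRY = {
    "5493001KJTIIGC8Y1R12": {"name": "Example Publishing plc", "parent": None},
    "549300GX4FPMFF91RJ37": {"name": "ACME Motors AG", "parent": "ACME Holdings SE"},
}

def normalize(name):
    # strip common legal suffixes so "ACME Motors" matches "ACME Motors AG"
    tokens = [t for t in name.lower().replace(",", "").split()
              if t not in {"plc", "ag", "se", "inc", "ltd", "corp"}]
    return " ".join(tokens)

def match_lei(raw_name, threshold=0.85):
    """Return (lei, score) for the best fuzzy match, or (None, score)."""
    best_lei, best_score = None, 0.0
    for lei, rec in LEI_REGISTRY.items():
        score = SequenceMatcher(None, normalize(raw_name),
                                normalize(rec["name"])).ratio()
        if score > best_score:
            best_lei, best_score = lei, score
    return (best_lei, best_score) if best_score >= threshold else (None, best_score)

lei, score = match_lei("Acme Motors")
```

Once a name resolves to an LEI, the parent field is what lets holdings roll up under a parent company, which is the picture financial institutions need.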
A lot of companies have made statements about where they wanna hit net zero, whether it's 2050, but they may not necessarily have a well defined plan to get there. And so this tool lays out kind of all the steps in calculating a temperature score. We use a couple of different methodologies, the One Earth Climate Model as well as TPI, and basically you can take a look at different scenarios and see the results. And it allows us to kind of have an independent view on where companies are relative to reducing emissions over time. Some of the key use cases, I won't go into these in a lot of detail, but asset owners, asset managers, banks, I think they're all really focused on net zero pathways and monitoring progress towards net zero goals. There's a multi-step process in the sector alignment tool and how the calculation occurs to get down to a final implied temperature score by company. And the way we take a look at it is, obviously a company may have made a commitment, but we take a look at where they're at and how much progress they've made and if they're on target or not on target. So then we weight what they say versus, like I said, where they're at currently and provide kind of a weighted average. Some of the data sources are listed here that go into the implied temperature rise, but we take a look at data and the different benchmarks and budgets that are available. We pull in, like I mentioned, some of that entity matching information and the GLEIF sources. The tool is pretty robust in that it can handle exchange rates and currencies to calculate a lot of the different information, but the ITR tool really has uncovered a lot of challenges with missing data and how estimates are produced. Michael Tiemann, who's an OS-Climate project lead, has been instrumental in building a lot of this tool, and he's identified issues with the methodologies and assumptions made with some of the models and has provided that feedback.
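The weighting idea Heather mentions, blending what a company says against where its emissions actually are, can be sketched very simply. This is an illustrative toy, not the published ITR methodology: the function names, the credibility weight, and the numbers are all assumptions made up for the example.

```python
def implied_temperature_rise(trajectory_score, target_score, target_credibility):
    """
    Blend what a company is actually doing (trajectory_score, in degrees C)
    with what it has committed to (target_score), weighted by how credible
    the commitment is (0.0 = ignore the target, 1.0 = trust it fully).
    Illustrative only; the real tool follows published ITR methodologies.
    """
    w = max(0.0, min(1.0, target_credibility))
    return w * target_score + (1.0 - w) * trajectory_score

def portfolio_score(holdings):
    """Investment-weighted average ITR across companies in a portfolio."""
    total = sum(h["value"] for h in holdings)
    return sum(h["value"] / total * h["itr"] for h in holdings)

company_itr = implied_temperature_rise(
    trajectory_score=3.2,    # current emissions path implies roughly 3.2 C
    target_score=1.8,        # stated net zero target would imply 1.8 C
    target_credibility=0.5,  # partial progress, plan only partly defined
)
# company_itr == 2.5
```

A company with an ambitious target but no progress stays close to its trajectory score, which is exactly the independent view the tool is after.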
So it's been a really great way, using open source, to improve the accuracy of some of these methodologies, which is pretty exciting. This is what the tool looks like. You can see here, this is a portfolio of different companies, these dots, and I know it's probably a little tiny. So like I said, if you're interested, we can do a deep dive on any one of the tools that you're interested in, but this is calculating the overall implied temperature score of an entire portfolio with a wide variety of different groups of companies, electrical companies, real estate companies, different things like that. And like I said, we take a look at different scenarios. This one is using SBTi's methodology. So you can see that same portfolio looks a little different when you use their methodology versus the other methodology, and the temperature score is showing as 1.98. You have the opportunity to look at different scopes, one, two, and three altogether versus one and two. And then looking at sectors individually. So this is our group of electric utilities, and you can see a lot of them are above the line, which is not a good thing, where the line obviously is the 1.5 degree line; they're well above that. And then looking just specifically at gas and their plans. So that's just a quick whistle-stop tour of the sector alignment tool. I'll jump over to physical risk and resilience. And this group of tools is really focused on creating what we call a plug and play set of hazard models as well as vulnerability curves, because there's no one guaranteed prediction of the future. And what I love about the physical risk and resilience tool is it allows you to, like I said, plug and play different models and see what the results are, so that you get a better trajectory of what the probability of impact will be. And in many respects, there's a lot of different use cases that physical risk and resilience can support.
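The hazard-model-plus-vulnerability-curve combination just described can be sketched as follows. The curve shape, the event probabilities, and the asset values below are all invented for illustration; real vulnerability curves and hazard data are far richer, and this is not the project's actual implementation.

```python
import bisect

def vulnerability(intensity, curve):
    """
    Piecewise-linear damage ratio for a hazard intensity, e.g. flood depth
    in metres mapped to the fraction of asset value lost. Different asset
    types (a building versus a dam) get different curves.
    """
    xs, ys = zip(*curve)
    if intensity <= xs[0]:
        return ys[0]
    if intensity >= xs[-1]:
        return ys[-1]
    i = bisect.bisect_left(xs, intensity)
    # linear interpolation between the surrounding curve points
    frac = (intensity - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ys[i - 1] + frac * (ys[i] - ys[i - 1])

def expected_annual_loss(events, curve, asset_value):
    """Sum over hazard events of annual probability x damage ratio x value."""
    return sum(p * vulnerability(x, curve) * asset_value for p, x in events)

# Hypothetical flood-depth damage curve for a building: (depth m, damage ratio)
building_curve = [(0.0, 0.0), (0.5, 0.1), (1.0, 0.3), (2.0, 0.7)]
# Hypothetical (annual probability, flood depth) pairs for one location
flood_events = [(0.10, 0.5), (0.01, 2.0)]
eal = expected_annual_loss(flood_events, building_curve, asset_value=1_000_000)
# eal == 17000.0 for these made-up inputs
```

Swapping in a different hazard model just means supplying a different set of events, and swapping the asset type means supplying a different curve, which is the plug-and-play point.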
A lot of regulatory agencies are talking about disclosures and disclosing risk. So that's an obvious one. Doing the scenario analysis and measuring the risk to a portfolio of assets and or companies is another use case. Understanding whether you want to make a loan to the next real estate developer who's looking to invest in building properties in Florida or some other area that's at high risk, maybe for flood inundation or coastal inundation or whatever, is another key use case. BNP Paribas, BNPP, leads this work stream. And so they definitely are looking at a lot of these use cases, but obviously origination of loans is one of the ones that's a big priority. And then obviously operational risk: if you're a company and you just wanna know what your risk is to assets within your own portfolio of assets, telecommunications, what's my risk to my cell towers or what have you. I think there's, like I said, a lot of use cases for physical risk and resilience. And BNPP has actually already started to use the toolkits internally to assess their risk on their lending portfolios. And we found a lot of interest, particularly in this project work stream, from academic organizations as well as other companies and consulting firms who are actively participating to contribute to the physical risk tool. Just at a high level, the tool pulls in a variety of hazards, think drought, again, flooding, heat, as well as other perils. We have a contribution from Jupiter that is a pretty broad solution that provides data for essentially all the hazards. The granularity is limited; however, it's a great indicator for first-time users to see if they have risks that they're exposed to. And then obviously different assets have different vulnerabilities. So a key piece of the physical risk and resilience tool is to look at the vulnerability profile of different assets, right? A building will have a different vulnerability profile than say a dam or some other type of asset. And then what's the risk?
What's the financial risk? What's the operational risk? What's the revenue potential loss and damage? All those types of things are also part of the physical risk tool, so that effectively you'll see something like this that comes out of the tool, where it has an impact summary of the probability, again, of various perils and what's the probability of having an impact on those sets of assets. And this is just the front end of the tool. So you can see in this particular case, the demo is showing chronic heat and mean work loss, and it's under a particular model, which in this one is the SSP5-8.5 scenario, and for a given timeframe. And so this is recent work that the team had done to take a look at at what point it becomes too hot to work, and what's the impact on an organization in terms of loss of productivity and people being unable to work. I mentioned the plug and play. So let me go to the next screen here. But again, we've got this sandbox UI, which we can give you access to if you're interested, so you can play around with the tool and really drill down in certain areas. You can upload a series of assets and see what the impact is, and the sandbox UI has some examples in it so you can see what the modeling looks like and how you can drill down into each asset and look at specific impacts. So this particular one highlights the fact that it's exposed to coastal flood. Just another key call out: while we started down a path of focusing on the financial community and the financial institutions and banks and asset managers, owners, et cetera, the physical risk and resilience tool as well as some of the other tools have public good use cases. So we have recently partnered with the Sustainable Africa Consortium Initiative to look at providing not only OS-Climate's data and the physical risk and resilience tool to the continent of Africa, but as well as creating a dedicated platform for them.
So there's work underway, there's a challenge that we're working on with Nigeria to focus on universities. We've started reaching out to 10 universities in Nigeria, and they're gonna be kicking off on the first of September around a challenge that looks at the agriculture sector and how they can determine what the physical risks are and come up with resilience plans for the ag sector there in Nigeria. We're partnering also with the Tony Elumelu Foundation to reach out to entrepreneurs in Africa as well, across all the different countries across the continent, to do capacity building and help train them and share the tools and the resources with them as well, so they have the latest and greatest technologies and they can come up with solutions to serve their areas. So that's pretty exciting. And there's a lot of opportunities that OS-Climate's looking at relative to grants and what I would call philanthropy use cases across the global south. Any questions on that before we go to transition analysis? I just have kind of a broader question. First I wanna congratulate you guys on the scope of the challenges that you're trying to solve and what you've built. I think it's really fantastic. I'm curious, with the solutions that you've described so far, where are you seeing the strongest current use case or demand for use of the platform? And how do you anticipate that changing and evolving? I guess from my perspective, and I'll let Truman chime in, physical risk and resilience definitely has what I would call the most active group of members, and across a broad community. We've got academic institutions participating. We've got, like I said, some of the larger members like BNPP as well as folks from like UN PRI all kind of contributing to that work stream. LSEG is active. So that's the one I think that people resonate most with, because you can actually physically see these risks in real time that are happening and playing out across the world.
So I think there's also a sense of urgency. I see that at least, kind of as someone that sees all of the different work streams and how they progress, but that one definitely has momentum and has legs, and obviously there's interest in the nonprofit use cases as well, like the Sustainable Africa Initiative. But Truman, anything you'd add to that? Sure. Yeah. So agreed, Heather. I think we're hoping that we're gonna see a significant uptick in the use of the sector alignment tool for portfolio alignment as that tool undergoes its next round of testing and, very importantly, is made more accessible to non-technical analysts, because in a lot of cases the analysts and portfolio managers that are making decisions around realigning investment portfolios to net zero goals don't have a GitHub account. And so when we presented this to the Net-Zero Asset Owner Alliance, which is a group of about, now I think, 40 plus pension funds and insurance companies on the asset side with a combined 10 trillion in assets, it was well received, and some of those are going to be involved in the testing. And so we'll see what that uptake will look like. Oh, Truman, you're breaking up. Okay, am I coming through enough so that I can continue? Yeah, you're just getting a little bit garbled there, Truman, so it's hard to hear you towards the end. Okay, is it clear now? Clear? Yeah, that's better. Okay, great. So just the final point there is we should be able to measure the uptake by the large asset owner community in the fall, Q4. Okay, yeah, you're starting to fade out again. I think you said in Q4 we hope that community will accelerate in participation. And so the only other thing, Sherwood, is the transition analysis.
I've definitely seen a bit more interest in that work stream, and I think there's good connectivity, right, in the physical risk feeding the transition analysis, which I'll talk about here in a minute, but that's another one where, I think, sometimes there's a quote about ideas being a bit ahead of their time, and I think some of the transition analysis work has been a bit ahead of its time. And now that people are starting to truly understand physical risk impact, now they're kind of like, okay, how do we start to think more holistically about transition? At least that's just my two cents, but there's a lot of words on this page. Just at a really high level, the transition analysis tool is composed of two parts. You'll hear us use the term WITNESS, which is really the framework and the model itself, and then there's something that we call the optimization engine, or the UI that front ends that model, which is something they call SoSTrades, and SoS just stands for system of systems. But the framework, or the UI, if you will, allows you, again, similar to the physical risk and resilience, to plug and play, assess multiple different scenarios, analyze trade-offs and business cases, and really look to optimize investments against objectives and the whole goal of transition. And originally this tool was a contribution from Airbus, and when you think about Airbus, their whole business model has to change, right?
What's the fuel that now powers the planes? The supply chain has to look completely different. And so for them, right, they're making bet-the-business decisions about how they have to transition over the next couple of decades. And so the transition analysis tool, and these are some of the use cases, helps ask and answer questions around: what's the expected GDP and the employment and the energy mix in a particular region going forward, and how could shocks affect the pace of decarbonization? It's a pretty complex model, and I'll pull it up here in a second, but it takes a look at policy and population and macroeconomic factors, and how much copper is left in the earth, and can you move to all batteries if there's a finite amount of copper left in the world. So looking at these different factors and different trade-offs, you can again run different scenarios and identify how best to kind of lead your company into the future. Again, I'm not gonna go through all the different use cases because I know we only have a few minutes left, but this is what the model looks like, and like I mentioned earlier, it looks at policy changes and how that might impact transition. It looks at the population, how many people will be employed, and scary things like, at certain temperature rises, the death rate may go up if people are exposed to heat, and different things like that. But it looks at capital and productivity and all the different macroeconomic factors that are out there, energy demand and energy use and the different types of energy. It's very robust in how many different energy types it models, from biofuels to solar, to wind, all of them that are out there. It also looks at investment in R&D areas, and, I used the copper example, again, material consumption, how much is left and how much natural resources are available. So all of those factors play into the WITNESS model; it's pretty complex.
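The optimize-investments-against-objectives idea can be illustrated with a deliberately tiny toy: search over one design variable (the renewables share of an energy mix) for the value that maximizes an objective trading welfare off against a carbon price. Every number and formula below is made up for the sketch; the WITNESS model computes these quantities from real policy, energy, population and macroeconomic inputs and optimizes over many coupled variables.

```python
def outcome(renewables_share, shock=0.0):
    """
    Toy model: welfare peaks at a balanced mix, CO2 falls as the renewables
    share grows, and a macro shock drags welfare down. All coefficients are
    invented; this only illustrates the shape of the trade-off.
    """
    fossil_share = 1.0 - renewables_share
    welfare = 100 * (1 - (renewables_share - 0.6) ** 2) - 50 * shock
    co2 = 80 * fossil_share
    return welfare, co2

def best_allocation(co2_price, steps=100):
    """Grid search: pick the renewables share maximizing welfare - co2 cost."""
    best_share, best_obj = 0.0, float("-inf")
    for i in range(steps + 1):
        share = i / steps
        welfare, co2 = outcome(share)
        obj = welfare - co2_price * co2
        if obj > best_obj:
            best_share, best_obj = share, obj
    return best_share

low_price = best_allocation(co2_price=0.1)
high_price = best_allocation(co2_price=2.0)
# a higher carbon price pushes the optimal mix further toward renewables
```

Even in this toy, raising the carbon price shifts the optimum from a mixed allocation toward all renewables, which is the kind of scenario-by-scenario trade-off analysis the tool supports.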
And again, if you or anyone in the community is interested, we certainly can do a deep dive on the WITNESS model as well. This is just an example of an optimization where they want to maximize welfare but minimize CO2 emissions, looking at different design variables in terms of how they would invest in technology, whether they would use different types of liquid fuels or solid fuels; like I mentioned, there are a lot of models built into the tool, and it's super complex. Yeah, not only complex, but it definitely is very robust in the modeling it does and how it looks at population and land demand and things like that. So, any questions on transition analysis before we move on to the last couple of tools I'll just mention?

I had one question. Sure. You don't have to go back to the slide, but one of those slides had a bunch of bullet points, and one of them was related to investment strategy. So just to clarify: if I'm a pension fund or a big bank or something like that, and my shareholders or my pensioners want me to get a 4% return on our investment, and something like oil and gas gives eight and renewables gives two or three, will it show the costs of moving investment back and forth?

I'm going to jump in here. So... I think it's the bullet. You have to go all the way back; there were some bullets, maybe I read it wrong, on your first slide with bullets, the very first one, where it looked like it was helping... Yeah, the first bullet point. Is that essentially a strategy tool for investment?

It's not, at this stage. What this would be is an input. For one thing, the model is global at present, and so the most important parts of the roadmap over the next several quarters are, a, regionalization, and b, sectoral specification and more detail around that.
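The welfare-versus-CO2 optimization mentioned above is a two-objective trade-off, and one standard way to pose it is to scalarize the objectives with a penalty weight and search over the design variable, here the share of investment going to renewables. This is a toy grid-search sketch with invented functions and coefficients, standing in for the real SoSTrades optimizer:

```python
import numpy as np

def welfare(renewables_share, fuels_share):
    """Toy diminishing-returns welfare from two investment shares."""
    return np.sqrt(renewables_share) + 1.5 * np.sqrt(fuels_share)

def co2(renewables_share, fuels_share):
    """Toy emissions: liquid fuels emit far more per unit of investment."""
    return 0.1 * renewables_share + 2.0 * fuels_share

def best_renewables_share(co2_weight, steps=1000):
    """Grid-search the renewables share that maximizes
    welfare minus a weighted CO2 penalty."""
    shares = np.linspace(0.0, 1.0, steps + 1)
    score = (welfare(shares, 1.0 - shares)
             - co2_weight * co2(shares, 1.0 - shares))
    return float(shares[np.argmax(score)])

low_penalty = best_renewables_share(co2_weight=0.1)
high_penalty = best_renewables_share(co2_weight=2.0)
```

Sweeping `co2_weight` traces out the trade-off curve: as the carbon penalty rises, the optimal mix shifts toward renewables, which is the shape of answer the tool gives, only over far richer models.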
While the models will ultimately get down into the microeconomics of a particular firm, and actually, I'll come back to that, it gets to Sherwood's earlier question, at present this will be really more of an input used in market sizing and these kinds of things for different types of technologies. So there would not be any sort of return-on-investment type of output from the public versions of the model.

Now, one other thing, getting back to Sherwood's earlier question about use and how this scales. One of the really exciting things about the project for OS-Climate, and about realizing our theory of change by having the open source projects then taken out and used in consulting with financial services companies and real-economy companies, is that Capgemini is about to launch, at New York Climate Week the week of the 18th of September, a new product and service offering where they will extend the open source model into the particular economics, markets, and specifics of a particular company. They're engaging with the auto sector and large emitters primarily at this stage. Sorry, long answer to your question, and I extended it beyond what you asked about, but hopefully that's helpful.

Oh no, that's helpful. I guess it's less about return; I was thinking about policy changes. If I'm an investment firm, I could show that unless you give some type of credits, or help me get the same kind of return on renewables through taxes, I have to stay in oil and gas because I've got to pay these pensioners. Could it give you that block of data that companies can take to various governments and say, this is why we're not moving as fast as we could; we need some more help? That's a great question.
And so we've been strongly encouraging, and Capgemini's clients have been strongly encouraging, them to extend the policy levers that are modeled beyond essentially carbon price into a range of the large policy levers for the major sectors. When it gets there, it'll be useful not only for companies that are lobbying for policies and measures like renewable energy standards or feed-in tariffs or whatever it may be, but also available to policy makers for their own deliberations, because this flows through; it'll be regionalized macro, so you can look at, okay, if I dial up and down various types of general stimulus, taxes, interest rates, different kinds of things, and compare, let's say, non-green programs versus those policy measures, you can make some quantitatively informed decisions about that. Okay, great, thanks. Great question, thank you.

Okay, and I know we only have five minutes left, so I'll be super quick. PCAF came up with a methodology, a definition, for carbon emissions. Sorry, just going to jump in to say that PCAF is essentially a large consortium of banks and lending institutions that are developing and implementing carbon accounting standards for their activities. Go ahead, Heather. Yeah, thanks for jumping in there. They came up with a methodology to calculate emissions for sovereigns, that is, countries, and OS-Climate worked with Allianz to implement that methodology: federating data sources like UNFCCC and PRIMAP data and a few other key sources, pulling that together, and actually producing a data set of the carbon footprint for sovereigns. And that's available. It's a project that we're now moving to phase two on. Right now we've got the data set produced, but it was done in a manual way.
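For a rough sense of what a sovereign footprint calculation looks like from an investor's side: the attribution factor sketched below, exposure divided by PPP-adjusted GDP, is a simplified reading of the PCAF sovereign-debt approach, and every figure is invented for illustration, not UNFCCC or PRIMAP data:

```python
def financed_sovereign_emissions(exposure_usd, ppp_adjusted_gdp_usd,
                                 country_emissions_t):
    """Attribute a slice of a country's emissions to a bondholder:
    attribution factor = exposure / PPP-adjusted GDP (simplified
    reading of the PCAF sovereign-debt methodology)."""
    attribution = exposure_usd / ppp_adjusted_gdp_usd
    return attribution * country_emissions_t

# Hypothetical holding: $500M of sovereign bonds in an economy with
# $2T PPP-adjusted GDP and 400 Mt of annual production emissions.
financed_t = financed_sovereign_emissions(
    exposure_usd=500e6,
    ppp_adjusted_gdp_usd=2e12,
    country_emissions_t=400e6,
)
```

The heavy lifting in the real project is not this arithmetic but federating and versioning the underlying country data, which is what the phase-two work addresses.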
And now, as I mentioned at the very beginning of the presentation, we're implementing our data-as-code data mesh around these main data sets, so that if UNFCCC or PRIMAP updates a data set, it will automatically trigger that entire data pipeline: pulling in the key data elements, performing the calculation, and creating the data set will all be done automatically through our data-as-code mechanisms, and it will all be tracked and versioned and things like that. That's an effort that's currently underway, and we're super excited by it. And there's a lot of interest in this particular data set from the community, so that's good work happening.

And just as an example of who's interested in that: a couple of weeks ago I had a great discussion with the person who's the new data interface, I guess, with the financial community for the UNFCCC, and Todd is very excited about examples of this, and particularly about the underlying infrastructure, which could potentially be used for a range of other similar things that are ambitions the GCAP has in the UNFCCC. Sorry, go ahead, Heather.

Yeah, no problem. And I know we literally have two minutes left, so I'll just touch on our one other work stream, the data extraction AI work stream. Just to refresh: we have 75,000 corporate ESG and sustainability reports that S&P Global contributed, and we're using machine learning, rules-based techniques, and natural language processing to extract key metrics from those reports so we can create a data set. I won't spend much time on this, but at the end of the day it's about pulling information out of unstructured data and making data sets with certain key performance indicators, whether that's scope emissions information or targets, and making those available.

Is this a vector database, then, that's being created for ESG data for the AI model? I'm sorry, I didn't quite catch that.
So is there, out there somewhere now, a vector database that the AI models are using? You vectorize the data into vectors? Yeah, I'm not sure, when you say vector database. So the models have been trained, and now they're being retrained on the 75,000 reports. There's actual work underway in terms of validating the accuracy of the model. So we don't necessarily have the full set, you know, here are all the metrics from the 75,000 reports, at this point, because they've been training on a subset of the reports. But yeah, if you're interested, I can definitely send more information about where they are. Yeah, I was looking for a vector database that's got carbon emission data; I can't find one. Got it, okay. It might be proprietary, I don't know. Okay. Yeah, I can definitely share where they're at and what work they've done so far, but it's all open source and available. Oh, okay.

I've got to apologize, because I think actually both of us need to jump, I definitely need to jump onto another call, but absolutely feel free to follow up with any other questions to us directly. Jeff, thanks for the questions that you raised. A great presentation, thanks. Super, well, great to be in touch, and Sherwood, thanks again, and David, we really appreciate it. This is great, we appreciate the opportunity, and likewise, if you push this out to the rest of the SIG, folks are absolutely welcome to reach out to us. We'd love to talk to the extent that, again, they might be interested in contributing, or in leveraging the tools in their consulting, or potentially in hosting a project with us. Thanks, Truman, thank you, Heather. Thank you, have a great day, bye. Bye-bye.