Good afternoon, everyone, and welcome to the Green Mountain Care Board. As chair of the board, I'll call this meeting to order. The first item on the agenda is the executive director's report, and I'll call on Susan Barrett. Susan?

Thank you, Mr. Chair. I have some meeting announcements for next week and then some public comment announcements. First, I want to remind folks that next week the board will be holding hearings via Teams on the 2023 qualified health plan rate requests. On Monday, July 18th, starting at 8 a.m., we'll hear from Blue Cross Blue Shield on their request. On Wednesday, July 20th, we'll hear from MVP, starting at 8 a.m., on their request. And then on Thursday, July 21st, we will be holding a public forum on these rate requests, and that is from 4 to 6 p.m. Again, all of this is via Teams. There will be a physical location, per open meeting law, and that is going to be at 144 State Street, but to be clear, all of the meetings and the participants from the board and the insurers will be participating through Teams. Relatedly, I also want to remind folks that we are taking public comment on these plans and these rate requests. We received these rate requests on May 6th. We opened a public comment period that started on May 9th, and that will close this Thursday, July 21st, at 11:59 p.m. So we'd encourage folks to make comments electronically through our website or through email, or you can always call the office to make those comments, and all that information is on our website. And then in terms of additional public comments, we also have a new public comment period starting today on the hospital budget submissions. We will open that comment period today, July 13th, and we will run it through August 30th. The information on the FY23 hospital budget review can be found on our website under public comments as well as under the hospital budget page.
The board will be holding public hospital budget hearings starting the week of August 15th, and the board's deliberations will begin on August 31st. So we ask that you submit your comments by the 31st in order to be considered by the board during its deliberations. I apologize for the barking dog. We are also accepting public comments on VHCURES. Hold on one moment; I think I should shut the door so you can actually hear me. I am so sorry. You're probably still going to be hearing him. So we are accepting public comments on the reporting manual update for VHCURES, our claims database, and you can submit those comments on our website as well. We'll be holding a public hearing on Tuesday, August 2nd, to discuss the changes. So please, we want to hear from you on any of your comments. And last but certainly not least, we are conducting an ongoing public comment period on the next potential all-payer model. The Agency of Human Services and the governor's office are leading those negotiations on a subsequent model to the all-payer model, so we will share any of those comments with them. And I will gladly turn it back over to you, Chair Mullin, so you don't have to listen to the barking dogs.

Thank you, Susan. The next item on the agenda is the minutes of Wednesday, June 22nd. Is there a motion? So moved. Thank you. It's been moved and seconded to approve the minutes of Wednesday, June 22nd, without any additions, deletions, or corrections. Is there any discussion? Hearing none, all those in favor of the motion, please signify by saying aye. Aye. Any opposed, please signify by saying nay. Let the record show that the motion carried unanimously. At this point, the main purpose of today's meeting is to talk about hospitals and quality, and I'm going to turn the meeting over to our own Michelle Degree to tee it up for us. So, Michelle, whenever you're ready.

Okay, can you guys hear me okay? We can. [inaudible] Okay.
It's always weird in the office because you hear yourself about six times around you. Thank you all for giving us the opportunity to talk today. I'm just going to do a really brief introduction and then let VPQHC really run the rest of this meeting to talk about the workgroups that the board participated in. So, a little bit of background. This quality framework meeting series stemmed from an Office of Rural Health grant proposal way back at the beginning of March of 2021. The purpose of the grant application itself was to support collaboration between, at this point specifically, VPQHC and GMCB, to assemble a uniquely designed hospital quality framework that we could eventually work to incorporate into our board's regulatory processes, namely hospital budgets. And the information that Ali (I believe I see Ali's name, but I see Kathy's face, so I'm not sure who) is going to present to you is just about those meetings and the end results. We're sort of using our platform here as an opportunity for further public comment, so the board itself will provide comment back to VPQHC, and that'll run through me and Susan, but this is an opportunity for others to hear about the work if they weren't involved and to comment, and that way VPQHC has an opportunity to hear any feedback before the close of this period, which I believe is the end of August, but I'm sure Ali will correct me if I am wrong. So with that, I'll turn it over to VPQHC.

Great. Thank you, Michelle, for that great introduction and that background on the project, and for providing the venue today to share the results of our efforts to date.
Thank you for your time today. I hope to do three things: talk about the difficulties with hospital care quality measurement in Vermont, being a small and rural state; share the collaborative approach we used in designing a framework for monitoring that quality of care; and invite suggestions for improving the framework. In terms of the timeline, I was kind of hoping for a two-week period; the close of this project is the end of August, and I'd like a little time to incorporate the feedback into the final report, but we can negotiate that. So I can start with the presentation. I'm looking at it on hard copy, but you're not looking at it yet, so please bear with me. As Michelle described, this work is funded through the Vermont Department of Health's Office of Rural Health and Primary Care. We're very grateful for that support. This presentation will cover an introduction, so a little bit about myself and the project; a description of the problem that we are trying to solve; the collaboration, the people who came together to work on the project and how we organized ourselves; the activities, so how we tackled the problem; the result, where I'll show you the first draft, really, of the measure set, which is part of the Vermont hospital quality framework we're drafting; and public comment, an invitation to share ways of improving the draft framework. So for those of you who I haven't met, my name is Ali Johnson. I am new to VPQHC. I joined in December as a quality improvement specialist after retiring from state service; I served 25 years at the Vermont Department of Health, mostly in cancer surveillance and research. My background is in database management, epidemiology, and program management, and I was a founding member of Vermonters Taking Action Against Cancer, our statewide cancer coalition.
I have lots of experience in garnering consensus among stakeholders and trying to really find measures that mark how well we're doing to move the needle on a big public health problem, in that case cancer. I'm really excited to be part of the VPQHC team and to share about the project. The purpose of this project is to design a framework of meaningful metrics that provide relevant information and accurately reflect the hospital system's quality of care within the healthcare reform environment in Vermont. So, a little bit of a tall order, but we have a huge group who helped with this, and we're continuing to work. So, the observed need: what problem exists that we're trying to solve? Well, as this group knows, there are so many possible measures out there for hospital care quality, and there are lots of report cards and ways of publishing all these different metrics, and it tends to create confusion. So we are hoping to do more alignment among regulators, and to have a gathering place where all different kinds of individuals, consumers, policymakers, regulators, healthcare professionals, and quality professionals, can come to understand how well Vermont hospitals are providing care. So here are about a dozen of the different hospital quality reporting programs the hospitals are involved with. Part of our work is trying to minimize the impact on hospital quality professionals in terms of sharing and submitting data; we found mostly passive ways of obtaining the information to ease that burden. So in doing this work, these are considerations, or as I like to think of them, constraints in the equation, things just to keep in mind, and there are three of these slides, so this might be a lot. As we mentioned, hospitals are engaged in many healthcare quality programs, and there are many platforms out there that display the data.
Many report cards claim to speak to hospitals' quality of care, but not all of those report cards are created equal, and part of what we did in our group process was listen to an analysis by Jason Minor from UVM and the Jeffords Quality Institute, where the different report cards are graded based on national criteria. And so that kept us thinking about, well, what best practice do we want to use for our report card, our framework? Of course, Vermont is unique, and any proposed measures must take Vermont's unique characteristics, typically being small and being rural, into consideration. Vermont statute tasks at least three organizations that we know of with assessing the quality of health care delivered across the system, and so I've linked the statutes that relate to the Green Mountain Care Board, the Vermont Department of Health, and the Vermont Program for Quality in Health Care. I've seen that this presentation is available online, so participants can access the statutes by these links if you'd like. So, best practice is to convene a multi-stakeholder committee to select the measures and determine a process for ensuring that any set of measures continues to stay relevant in this environment. And as part of the budget review process, though, we understand that the limitations of any quality framework must be made explicit, and hospitals must be able to tell the story behind the metric. So, that's sort of setting the stage. And Susan, or Chair Mullin, how should we handle questions? Should we wait until the end?

I prefer to wait to the end if possible, and we'll start with the board first, and then we'll go to public comment.

Okay, great, then I won't solicit any until the end. Thank you; I meant to ask that earlier. So I want to share a graphic now of all the individuals and organizations that came together to work on this problem.
And you can see five sectors of health care represented here: government, insurers, education and research, hospitals and providers, and consumers. There are 56 individuals who participated, representing 25 organizations, and I'm really happy to say that we actually had four consumer representatives on the team. And, you know, a takeaway for me at this point in the process is that I realize inclusion and consensus are related, and sometimes inversely, and this process erred on the side of inclusion. So we did get some consensus, but just keep in mind the size of this collaboration as we share some of the proposals with you. So how did we organize ourselves? We have a work group charter that has the purpose of the group, the business case, the scope of work, and the schedule. We have an appendix with the work group members and processes, and we have a whole list of resources, which I'll show you a screenshot of, that we keep on the VPQHC website; there's a little document portal that work group members are privy to. Before getting into what our work group did, I'd like to recognize the work that Michelle mentioned early on: the VPQHC team, Green Mountain Care Board staff, and the Vermont Association of Hospitals and Health Systems, for laying the groundwork for this project in their, it sounds like, March 2021 proposal and their August 2021 report, Building a Hospital Quality Framework. This was all done before I joined the organization in December, and so I had a wonderful start, a project that was already packaged and ready to jump into. I'm really grateful for that. So here's our work group timeline. You can see that it's just the months of January through August of 2022, and I have these satisfying checkmarks to take you through the timeline and our activities. So, we convened the work group and established the work group charter early on.
We also set some meeting guidelines, and I think they worked really well in the meeting facilitation in terms of being able to bring different voices and perspectives to the process. We realized early on that our membership lacked diversity, and so we recruited new members to try to round out that representation. We had some orientation to the Institute of Medicine's six domains of health care quality and the inventory of current measures, all of those. We have the three statutes and all the material I showed you, all these different reporting programs and hospital requirements, and we asked, what are we already collecting? We also fielded a survey of work group members, asking what measurement programs are you participating in and how are you using quality measures like this. We reviewed the survey data, and we proposed measures. In May we evaluated, or scored, the proposed measures, and I'll share the scoring criteria with you in a minute. We finalized the proposed measures; it was more like July than June, but it's still a checkmark. And here we are submitting for public comment, and I'm so happy to be at this point in the timeline. This Friday I will have something on our portal, and there are directions to it later in this presentation; what will be on the portal is the draft list of measures that we are accepting feedback on. Then I'll be compiling and integrating public comments into August and running them by the work group membership to get the final draft to the health department by the end of August to meet our deliverable. A whirlwind, and I'm really delighted to say that many of the people on this call were involved, and many others too. To show our real appreciation, many thanks to our presenters; we had six different presentations over the course of six months, to show various perspectives of what's important to the different sectors of health care on health care quality.
So this is a screen capture of our document portal, and it's not even complete; we have lots of resources that were studied and used to understand the evidence for quality measurement and best practice. And you'll see in the lower right-hand portion of the slide there is a link to this portal. Well, it's a link to this page, and then you have to go to the portal and put in our secret-squirrel password, framework 123. So it's clearly not private, just a way for people who are very intentional in this type of work to have access to all these materials. Okay, so a couple of considerations for proposing measures. I mentioned that assessment we did where we inventoried all the measures, so we wanted to say, okay, let's start with what hospitals are already doing. We have Act 53, also known as the Vermont hospital report card, and again, these are all parameterized links, so I encourage participants to go and look at these different ways of displaying hospital quality data; the Medicare Beneficiary Quality Improvement Project; and hospital-level metrics under Vermont's all-payer model. So we put all of those in a basket. These are the technical guides with the data dictionaries, how these measures are described and calculated, and the rationale for capturing them. And then we looked to other resources. If there were topics identified by the work group that were very important to collect, mental health is a really good one, there weren't really existing measures, and we had to continue looking for them. And these are some of the places that we looked: the National Quality Forum, where we followed the rural health work group recommendations; the CMS measures inventory tool; and the National Quality Forum's developing health equity measures report. That was actually published in 2022, during the six months of this work. The National Quality Forum took down their health equity designations.
They had described certain measures as being health-disparity sensitive, as an attribute of each of the measures in their big database, and they stopped doing that partway through this process, so luckily I had captured that information early on, but that was a resource for us for a while. So here are the scoring criteria I mentioned earlier. First, the measures have to be feasible to collect, so we looked at whether measures were required of critical access hospitals and larger hospitals. We looked at whether a measure was important to collect: had the work group identified it as a priority, and does it align with the hospital reporting programs that hospitals are already involved with? How does it meet the NQF endorsement criteria? Being a small rural state, we thought each measure needed to be rural-relevant and resistant to low case volume. As it is, with our hospital report card, some hospitals have no data to report or display just due to the small number of cases for certain measures. So we were looking for measures that could be reported by the highest number of hospitals possible. And the group thought it was really important to have an established pathway for how a hospital could affect a measure. You'll see on the final list there are a few measures, kind of systems measures, that have to do with transitions of care, which are hard to attribute to the sending hospital or the receiving hospital. Those are a little bit aspirational, I would say, as a type of measure, because we're still figuring out how the sending hospital could influence a measure if the receiving hospital doesn't have beds for a certain kind of patient. So these are things to consider here.
So basically, in a nutshell, we had 44 measures that met some basic criteria and aligned with the Institute of Medicine's six aims for health care quality; that was really important. We wanted to represent each of those domains, we wanted to have a mixture of process, outcome, and systems measures, and we wanted the measures to be in topic areas that were important to collect. So we took these measures and had three quality professionals compare them to the scoring criteria and come up with a score. We then shared that out with the work group members in a survey and provided the evidence base for each of the measures. The work group took this survey and voted, within each domain, on the most important measures to collect. And then we had a final review. Dr. Don Dupuis from Copley Hospital was the sole volunteer from the work group, and we corrected a couple of things, took out a couple of payment measures that were going to be too difficult to collect, and added a couple of readmission measures. That shook out to 18 measures, which was perfect, because the work group had been surveyed very early on in this process about the number of measures to collect, and they wanted to keep it under 20. And so we did it. So, okay, the result you've all been waiting for. My presentation basically boils down to two slides. Here are the first three of the domains, the Institute of Medicine's health care quality domains, and here are the measures. I'm not really going to spend time on each measure, other than to say this presentation is publicly available, and I can provide more information upon request about how each measure is calculated, where the data source would be, and so on. Equity, though, I will say was a really interesting one. I actually presented this to, and I want to get the name of this organization correct,
the Health Disparities and Cultural Competence Advisory Group. I met with them a couple of times to ask about health equity, since this is really an emerging area of health care quality measurement. And we understand that screening for preferred spoken language for health care will be an aspirational measure, because claims, electronic health records, and hospital practice may all vary on this, so it may be hard to compare. But we wanted to include it; this was the best equity measure we came up with, and this council agreed. I had a great conversation with them. And I understand that social determinants of health is an emerging area, too, for data collection. So, the final three domains: patient-centeredness, safety, and timeliness. And, you know, if you're in this world and in this kind of work day to day, you'll see that many of these are already in the hospital report card that the Vermont Department of Health publishes. Behind the scenes, there is a spreadsheet that contains all of these characteristics for each of these measures, and I've worked with Terry Hota from the Vermont Department of Health; she may be on the call. So kudos to Terry for reviewing all of the measures, making sure I have this metadata correct for each of them, and identifying the data sources. So that's the list of measures. The framework, I think of as the list of measures plus the portal, or the way of displaying the measures, and so we only have the list of measures really available for public comment at this point, but I wanted to share that we do plan to have this comparative way of displaying data, and we might also call it a dashboard. We want it to be easy to find, easy to use, and have good explanations about what we're measuring and the stories behind the curves, if hospitals are doing particularly well.
We want other hospitals to be able to scale what the high achiever is doing, so that they may improve quality in their hospital too. We want to have appropriate benchmarks and be able to display observed versus expected values. Hospitals reported that they really want to be able to show their own individual trends, so we'll have a way to do that too. And something that's not on this slide is that we want to have a landing page that's very consumer friendly, that might point out other resources for things that might not have gotten on the list, for example the patient safety surveillance system, and being able to, like, pay your bills; the Vermont Health Care Advocate is helping us with that. So this is the part where I'd like to invite ways to improve the draft framework. Here are the instructions: again, by Friday I expect to have this link that the blue arrow points to parameterized, so there will be a document behind it. This is the link you would use to access the landing page where comments are due, and it will basically be just an email to me. And just before wrapping up, I wanted to say that this is an iterative process. This was a first try; we understand this is not going to be the end-all, be-all final product of having a quality framework, but it was a really good first try, and it was very inclusive. I understand that more outcome measures of high quality and reliability are needed. My door is open to continue the conversation, and we're going to see after August how the work will be continued with the Vermont Department of Health and what comes next for the work group; that's still kind of to be determined. We've recognized, for next time, these two types of measurement reporting systems that didn't go into the original framework and will absolutely be considered more next time. And are there any questions?

Thank you very much. Let's start with board questions from members of the board.
I guess I'll go in reverse alphabetical order and start with Tom Walsh.

Thank you, Kevin, and thank you, Ali. Great job convening so many stakeholders. That's an important aspect to this, I think. Sometimes when we go to great lengths to convene stakeholders, it gets hard to satisfy every stakeholder, and that can be difficult. So, job well done, and I look forward to seeing more. I was wondering, though, if you could explain a little bit more about the surgical outcomes and the PROMIS measures, and what's behind the decision to leave those for next time.

They were not introduced early on, so they didn't go through the rigor of being considered, scored, and surveyed. So I just didn't think that it would respect the process and timeline to just sort of plop them in at the end.

One of the things that I thought you might say is that with the way that we prioritized, we've gotten to 19 while trying to stay below 20, and each of those items that are linked has multiple measures within it. And so, right, would we meet our goal of staying below 20? So if it does come down to trying to decide what might be cut, I did notice that, I think it's in the effectiveness bucket, or the first bucket, there are three measures of 30-day readmission: there's overall admission, there's heart, and there's pneumonia. And I don't know that there's really much added information in comparing heart versus pneumonia, whereas going with just the overall 30-day readmission might get 90% of what we need, and so we could free up two places with that type of thinking. If we went through the whole list thinking like that, we may be able to include more of the surgical outcome measures, where so much of the cost in health care, and the discretionary cost particularly, is around that area, and patient-reported outcome measures, which, in my understanding of the literature, is an area that's going to become more important when it comes to focusing on equity.
So if we're going to be looking at all patients' patient-reported outcome measures and then stratifying by socioeconomic status or another variable, then we'll get more insight into what's actually happening to individuals in different strata. And so, are the PROMIS measures able to be stratified by demographic information?

I'm not sure. The HCAHPS, that's another patient-reported type of measure, I'm not sure those are able to be stratified; there would need to be work to do that.

If we're collecting them within the state, then we can marry that with other data sets that have variables for gender, race, ethnicity, income, education, those types of things, and we can do the stratification. So, this is a really great start, and there's a lot to be proud of with what you've done and how you've done it, and I hope we keep going. And thank you for being part of this.

Yeah, it's been a pleasure.

Okay, next we'll go to board member Pelham. Tom?

Well, this is dizzying, if I can get the word out right. You know, it's, data is, I mean, I'm just thinking about my experience coming on the board and kind of poking around in all these little corners to find out what's going on, and kind of looking at, I think it's the US census data on cost per capita. You know, the data was from 2014, and in 2017 that still seemed kind of relevant, but now it's irrelevant because it's old data. And so my sense is that this is a moving environment all the time, a moving environment that can't be followed with any sense of currency. You've got disease incidence changing over time; you look at some of our stuff in the all-payer model having to do with prediabetes, and the data changes over time, and people want to be current.
So, you know, kind of like reform and infrastructure: to me, fixed prospective payments are very important, but three years from now they might not be important at all. So stuff comes and goes, and I'm kind of sitting here thinking about my experience in Arlington with the Canfield Library, and that may be a model, where there are a few librarians who are neutral about the data but are just there to be traffic cops. They know what the data is and what its relevance is, and their appeal will be that they can shorten the timeline it takes people to get up to speed with what it is that they want to know. You know, I could go into the library in Arlington, and my mom could too, and say I want a book about this, I want a book about that, and they knew where it was; they could tell you kind of what was in it, but they weren't experts on it. And so some kind of a dashboard could be created, or kind of a system where people are directed in directions that will be helpful to them. I mean, for certain periods of time, certain things might be important to a certain group of people, and then three years later, whatever that importance was has dissolved and people have moved on. So not only kind of directing people to data that is value-added, but also directing them toward people who have a common interest, who can share the effort of accessing and utilizing that data.
I just worry that trying to set a certain set of standards for metrics, boiling it down to a specific number, is, over time, a process that can be very difficult to sustain. And so, you know, I can't be very helpful here; I just think about my own hunting-and-pecking experiences trying to sort through all the data that's available, and then boil it down to, in my view, and it's personal to each of us, the data that is important to a critical path of getting change effected or getting to understand the nuances of an issue. And I don't think that there's any system that can keep pace with that, except a system where there are very capable people in a well-understood place who understand where the data is and can direct people to it, so they can work with it however they want to work with it, but at least they're aware of it. So I kind of wish you luck. I think this will be a point-in-time experience, and I think the focus should be on how to take this experience and give it a life that keeps on giving. And I'm not quite sure a dashboard does that; I think that's human beings in a central place who understand all the data sources there are, so someone can walk in the door and say, I'm really interested in this, and can be told, well, go there, go there, go there, or maybe the Middlebury College library has some important stuff. Just people who know kind of where the data is but aren't married to it; they're just good librarian traffic-cop types.

This is reminding me of the Health Department's data encyclopedia, where they have information about what each data set contains, what some of the barriers are to each data set, ways that it might be used or might not be used, and who's in charge, who you can contact for more information and for more data.
And that's one little nugget of all the data available related to health; it's more the public health data sets, but some health care as well. So a resource like that might be helpful. I'm hoping that VPQHC can continue to be a neutral place where this information can be warehoused.

Well, that sounds like a natural fit, in my understanding, for VPQHC: they're the data warehouse, they know where it all is, they're not judgmental about it, and they can point people in directions that help them get acquainted with and use the data that's available. We'll keep doing our best. Thank you.

Okay, next we'll turn to Board Member Lunge. Robin?

Hi, thank you very much for the presentation. I guess my question was whether, during the course of the process, the group discussed the ways the framework might be used and by whom, and if you can give a little sense of that discussion and what ideas came out of the group.

Yes, gladly. Great question. Thanks. We talked a number of times about audience: who are we trying to reach? We voted on this, we discussed it, and it's one of the areas I'm not sure we ever came to consensus on. The group's purpose was initially kind of a purist thing: we want a set of measures that is representative of the quality of care that hospitals are providing, and we want the quality professionals at those hospitals to be able to tell their story. So when I asked the group about the audience, you would think the answer would be the quality professionals at every hospital, but the answer kept coming back to consumers. But evidence shows consumers don't use dashboards to make care decisions, particularly in acute situations. So the answer is still a little bit to be determined; we really tried to pin that down during this process. We expect the dashboard to be consumable by anyone.
And so we want the front-facing part of this to be very consumer-friendly, and we hope the metrics provided will be helpful to a broad audience. Does that answer your question?

Yes. I was just trying to see a connection from where you started, which sounds like something we might use in our hospital budget process, to where you landed, which still may help us get there. But obviously there are more steps that would need to happen before that takes place. So I was just trying to understand what the group itself wanted, or thought the data would be useful for.

Well, we were thinking about the hospital budget process. I think what you're getting to is value, so quality over price. Our work group's charge specifically omitted any financial consideration, so I think it will be a challenge moving forward to compare cost and quality in a meaningful way, where those two things are really tied for each hospital so you can connect a line there. It was a little bit outside the scope of this round. Thank you.

Okay, next we'll go to Board Member Holmes. Jessica?

Great, thank you. Hi, colleagues. Thanks for pulling all this together; I know there was a lot of work in herding cats of that size in terms of building consensus, and I can appreciate all the hard work. I have a couple of questions and comments, I suppose, if you're soliciting feedback, as I know you are. One of my concerns, I guess, and I recognize the issue, is that in the scoring criteria, one of the criteria is resistance to low case volume. I fully understand why you would want to do that; low case volume can cause statistical issues. But those are some of the cases I worry the most about: consistently not reporting where there's low case volume. I've talked about this at lots of board meetings, but we know that there's a volume-quality relationship.
So what I would love to see in a quality framework is at least some reporting of instances of low-volume cases, where we know the evidence is strong that those low volumes are often associated with poor outcomes for patients.

So are you saying this is an instance where there's a hospital doing really low volume on a particular case, and we know there's a relationship between volume and quality?

And it's falling below standard measures of minimum volumes to maintain quality, so some way we can measure that.

Just for clarity, are you talking about procedures, the number of procedures? So a quality measure might be the number of complications over the number of procedures. Are you saying we should report out when we have, like, less than five, less than a threshold, of those complications? Or are you saying we should report out, probably for all hospitals, the number of procedures, the denominator essentially, even though it's small, in instances where we might be worried about that number falling below a case number?

I'll give an example from orthopedics: knee replacements, joint replacements. When that number falls below a certain threshold that's widely accepted in the medical field.

Got it. Either at the hospital level or at the provider level?

I think that's worthy of understanding and reporting out in a quality framework: that those numbers have fallen below widely accepted procedure numbers needed to maintain certain levels of quality. That's the flip side of understanding. We do live in a rural state where we are going to have low case volumes. When do we worry about low case volumes, I guess, is what I think would be helpful to think about in this quality framework. To some degree, I think births is an area, and again, that's an issue of both quality and access.
But there are areas where we start to worry about births falling below some critical threshold, and when do we worry about that? I'm not a medical provider and I don't know what those numbers are, but there's evidence out there. So how can we start to incorporate those types of quality measures in a different way when it comes to the case numbers where we know there's that volume relationship?

Yeah, I appreciate that insight. We don't want to appear like, la la la, we can't hear you about small numbers. We want to be able to represent that as well.

Okay, thank you. And I would echo my colleague Tom's concern, or interest, in surgical outcomes. To the degree that that's still a priority going forward, it would be a priority for me. I think a lot of the data in this framework is data that is publicly available. What's not been widely available, or understood, at least from my perspective, is surgical outcome data. That's not something we can easily find, identify, or understand. Revision surgeries, that's not something we know about. So that would be a huge value-add of this framework, because that data is not already out there. I know you had a great slide of all the places where data comes in, but surgical outcome data, I don't think, is in many of those. Some of them have complication rates and infection rates, but I think there's probably a pocket there that's less full in some ways. Is that right, or am I wrong?

Yes. Surgical outcomes were a priority very early on in the group. We attempted to find sources of information for this throughout the process. The two sources I mentioned at the very end were revealed after the measure list was finalized, and one of them that I've looked into, just very briefly, only has three hospitals reporting.
So there are a lot of constraints in the equation, trying to have it be representative of the whole system and of the topic areas that you want. There's a lot to balance. Kathy has had her hand up; did she want to elaborate on that?

I just want to throw in, regarding the National Surgical Quality Improvement Program: that surgical outcomes database is the gold standard, I would say, overall but very specifically for surgery and surgical outcomes. It's a wonderful system and organization, but two things: it's expensive and labor-intensive to collect that data. A number of years back, some of you may not be familiar, it would have been prior to your arrival at the board, VPQHC worked with a number of Vermont hospitals, large, medium and small, on the NSQIP project under the SIM grant. That support is no longer available, and it was very expensive to get that system in place; it would take a commitment of resources to do it again. But if that becomes a priority and we can find a pathway to funding, it would be great to re-institute it. As Ali said, three hospitals continue to submit data to NSQIP because they prioritize that information within their organizations' service lines and quality improvement programs. So I just wanted to make you aware of that.

Thank you, Kathy. That's hugely helpful, and it's something that in the past we've brought up and talked about: is there a funding source? How do we incorporate consideration of NSQIP funding in our hospital budget process to incentivize hospitals to do it?
So if the group decides that this actually would be a useful gold standard that we should have all hospitals contributing to, maybe there are mechanisms by which we can incent hospitals to do so, if it's truly going to be informative. So creative opportunities to think about how we incent and fund adherence to, and submission of, NSQIP data would be, I think, a great area of exploration.

Just a quick question under the effectiveness bucket, on 30-day readmission rates. When you say hospital-wide, do you mean system-wide? Is it readmission just back to the same hospital, or does it capture readmission to any hospital?

I believe it is the same hospital. Kathy, could you please correct me if I'm wrong? I think you're on mute.

Yes. Due to the current limitations in the way our data systems can track that information, we're kind of limited to returns, readmissions, to the same organization. And I think that's where, maybe two Green Mountain Care Board meetings ago, when we talked about VITL's new regional relationship, that gives me great hope that we'll be able to get into larger data sets to conduct those more comprehensive analyses, and to be able to do that just within our state as well.

That would be fantastic. I think that would be helpful. If a patient doesn't return to the same hospital but goes to a different hospital, for a whole host of reasons, that would be just as important to track in terms of quality. So let's maybe put in a placeholder, and be hopeful that we can get that data in the future. And my last comment would be on the dashboard. You mentioned, Ali, benchmarks, and I think it would be really helpful to have, at least from my perspective, the triggers that raise concerns. Almost like a red, yellow, green.
When is it red? When is this quality measure red and something to be concerned about? And to Robin's point, then we have to figure out, well, what are we going to do with the data, and what does it mean when it's red? But I do think it's about understanding what the benchmark is, and then how far off the benchmark you have to be before you've got red flags. And color coding is wonderful, right? Red, yellow, green is often just very easy visually, appealing, and understandable by everybody, hopefully.

And we'll need a lot more input from our quality partners on that. There was some discussion about benchmarking, and it's going to really depend on the measure whether hospitals want to compare to their own past performance, to a state average, to a national average, or regionally. So more to come on benchmarks, but I'm very understanding of your suggestions, and thank you.

Great, thank you so much. I really appreciate all this work, and I'm looking forward to seeing more, hearing more, and figuring out how we can use it.

Okay, thanks for having me.

So at this point, before I go to public comment, I wanted to tell everyone that we will have a portal on the site for public comment on what people have seen so far, and any suggestions on how to improve it. And just to reinforce the previous conversation with Board Member Holmes: we've all heard so many stories about someone who had what they believed was less-than-quality care and went to a different place the second, and sometimes the third, time. You know the old adage: fool me once, shame on you; fool me twice, shame on me. I think it's ingrained in all of us that if we're not happy with the outcome at the first place we go, we go elsewhere. So that's such an important piece to track, and readmissions of any kind really are inefficiencies in the system. So with that, I'll open it up to public comment. Does any member of the public wish to comment at this time?
The first hand up I see is Walter Carpenter's.

Thanks, Kevin. Jessica echoed my thoughts on low-volume care, so I'll let that lie. I just wanted to ask: what are we going to do with all this data? This health care system in America and in Vermont has so much data that we could walk to the moon and back if we stretched it out in a line. So what are we going to do with this data?

The hope is that we would hold people's feet to the fire, Walter, and basically hold them accountable for data that shows where some improvement could be made at their particular hospital.

And how would we do that?

You have a conversation in public, and things start to happen.

And Walter, this is Kathy, if I can just jump in. I think a lot of the features Ali presented, for example the ability to apply benchmarks, are how individual organizations understand what I'll call their position in the pack. Are they a leader? Are they mid-pack, or are they falling behind? We take the data and turn it into information that becomes useful in informing organizational priorities, statewide improvement projects, and local improvement projects. We first set out on this project really committed to not adding to the burden of data collection and reporting, but to making effective use of what's available, to look at a more systemic picture and identify gaps, duplications, and areas for potential improvement. So I hope that answers your question. And I just want to reiterate, as I think everyone has said many times, this will be an iterative process. This is version one coming out of the gate, after hearing the input of more than 50 contributors to this work. I think this initial part of the process, hearing all the voices, was so essential, and we have that memorialized in many of the documents.
The documents are available on our portal, and we're happy to share them. We intend to work collaboratively with our partners to keep this moving forward with future versions: version two, version three, version four.

Okay, next I'm going to turn to Ham Davis. Ham?

Thank you, Kevin. I'll keep this brief, but there's a huge air of total unreality here. I don't know the process that Johnson is talking about, but I can tell you that I was a founder, one of the founding members, of People to See in 1988. We went through all of these issues. We in fact had a huge statewide convention with all of the players, hundreds of people. The reality is, and Walter Carpenter has just got it, you've got to listen to Walter here. The reality is, first of all, that nobody in the hospital business wants anything to do with public quality reporting, because it can make them look bad. They already have data like this. The data exists; you have it right now. It came to you last October. I forget which consultant, there were a whole bunch of consultants, but there's a report that listed at least four hospitals doing surgical procedures below the Leapfrog volume rule, where there are just too few, but they do them anyway. You have heard Sara Barry from OneCare Vermont say that they are collecting, okay, they are collecting revision surgery data. The other hospitals just do what everybody does, so supposedly it doesn't ever happen in Vermont, ever. But what happens in Vermont is revision surgery. You can count revision surgery, if you want to, and do something about it. The whole reason Act 48 is so powerful, and gives the Green Mountain Care Board power that dwarfs anything any state in the Union has, is specifically in order to solve this problem.
If you have a place that is doing, in a state of half a million, 600,000 people, 12 out of 14, no sensible policy person is ever going to think that makes a bit of sense. So I don't believe any of it. The reality is, it's too dangerous for hospitals; they don't dare do this stuff. And you have some data right now that you got last fall that tells you, here are four hospitals, I can't remember which ones, but they're right in your data system. You hired these consultants, and they told you, here are four hospitals doing surgeries they should not be doing. Can you do anything about it? Of course you can, because the whole way you control this system is through the budgets. The hospital industry in the whole United States is terrified, terrified, of actually looking at quality outcomes. They look at checking boxes. Did you do this, did you do that, did you take their temperature, did you stand up and turn around three times? These are all things that you need to do. What they don't look at is what the outcome is. And if you want to do this, you're going to have to get the outcomes. Thank you.

Thank you, Ham. Michelle, I see you have your hand up.

I do. I just want to clarify that the hospital-wide readmission measure is to the same or a different hospital. So it is both ways. That's all.

Okay. Is there other public comment? Is there other public comment? Hearing none, I really want to thank Ali and Kathy and everyone who worked on this project. I'm a little bit more optimistic that the results of this could be very meaningful. Nobody wants to have bad outcomes.
And the discussion of those will help everyone come to decisions that try to keep them from happening. So it's very important information, and quality is one of the three pillars of our mission here at the board. We'll continue to work with you on this very important effort. So thank you.

Thank you so much for having me. And I realized that I omitted the final slide, which has my contact info for public comment. I'll include that in the chat. Thank you.

Great, and we'll make sure that gets posted on our website as well. So with that, is there any old business to come before the board? Hearing none. Is there any new business to come before the board? Hearing none. Is there a motion to adjourn? So moved. Second. It's been moved and seconded to adjourn. All those in favor of the motion, please signify by saying aye. Aye. Aye. Any opposed, please signify by saying nay. Thank you, everyone, and have a great rest of the day.