Green Mountain Care Board's hearing of November 29th. I hope everyone had a nice Thanksgiving. Today we'll be hearing from Michelle Degree, our Health Policy Project Director here at the Care Board, and Lindsay Kill, our Data Analytics and Information Chief, and a number of other folks from DVHA, Blue Cross, MVP, and OneCare for a series of presentations on the ACO results payer panel. First, I'll turn it to Susan Barrett, our Executive Director, for her report. Yes, thank you very much, Mr. Chair. I just wanted to remind folks that there are several ongoing special public comment periods. If you go to our website under public comment, it will take you to that site. We have several ACOs that the Board is currently reviewing, and in addition, we are looking at the health information exchange strategic plan and connectivity criteria, so we're accepting public comments on that process as well. And then two ongoing projects. The first is the community engagement work that is part of Act 167. We wrapped up the community engagement meetings right before Thanksgiving, but we are continuing to accept public comment, so please hop on the website, learn more about that project, and provide any further comments. And lastly, we are accepting public comments on a potential next model that Vermont would enter into with CMMI; we're sharing any of those comments with the team leading that work, and we encourage folks to share their thoughts on the website on that issue as well. So with that, I will turn it back to you, Mr. Chair. Thank you. And we have the meeting minutes from November 20th. Is there a motion to approve the minutes from November 20th? So moved. Second. And all those in favor of approving the minutes from November 20th, say aye. Aye.
in 2022, a contract with the ACO to discuss quality and financial performance from that year. I know this is a time where it gets a little confusing. We're talking about 2022 today, while we are in the middle of reviewing and presenting on OneCare's proposed 2024 budget, so definitely a little behind. We just want to acknowledge that we're talking about different performance years here. So, let me move ahead. We're going to do a really quick background and then we'll go through the results. I'll have everyone introduce themselves as their slides come up; just let me know when to advance. We've combined them all into one deck for ease, and then we'll go into board questions and public comments. A reminder that today's discussion is under the Board's ACO oversight authority. Under the model agreement, an ACO is a legal organization of healthcare providers that agrees to be accountable for the quality, cost, and overall care of the beneficiaries assigned to it. The ACO's scale target programs must reasonably align in their designs across payers, which includes the ACO payer contracts that we're going to talk about today and the quality measures within them. These measures, while related, are distinct and separate from performance under the all-payer model quality reporting, results of which we currently have through 2021, with 2022 anticipated probably in the second quarter of 2024. It takes about 18 months for those reports to come out, so now I'm trying to do math in my head. It's a dangerous game. So, a quick reminder here. We've got a crosswalk to show you where all of these intermingle and where we might see some cross-pollination. We've got all of the all-payer model measures listed here.
And while there is overlap across the payer programs, any differences that remain are typically due to the types of populations covered, right? So, in the Medicaid space, where we're talking about children and adolescents, you're probably not going to see a similar measure for the Medicare population. With that, Lindsay and I will go into, if you imagine me having a separate hat, putting on a CMMI hat and presenting on behalf of Medicare. So, I believe we're going to start with Lindsay. If you want to go ahead, I'll advance to the next slide. Thanks, Michelle. Good afternoon, everyone. My name is Lindsay Kill and I'm on the data and analytics team here at the Green Mountain Care Board. Just to reiterate, I'm about to talk about OneCare Vermont's Medicare participation and Medicare financial settlement. What this chart is showing, year over year in the model, is our prospectively aligned Medicare ACO beneficiaries at the start of the year and then those who are ultimately included in settlement. These are two different populations because the program limits which beneficiaries can be included in the financial settlement. Those beneficiaries must maintain eligibility for the entire performance year (or, where individuals pass away during that year, remain a Medicare beneficiary until they pass away), or they must receive 50% or more of their primary care services in the ACO service area. What you're seeing in this chart is that the difference between those prospectively aligned at the beginning of each year and those included for settlement by the end of the year is growing over time. That is because in Vermont, over the course of this model, we've seen a substantial uptake in Medicare Advantage enrollment, and one of the criteria for being included in settlement is that you are not a Medicare Advantage beneficiary. So that's where that difference is coming from over time. Next slide, please.
On this slide, we have a snapshot of what is already posted on the website: the 2022 financial settlement spreadsheet. We also call it the shared savings and losses. The way to read it is top down across the two columns for our Medicare population: the traditional Medicare aged and disabled population is in the first column, ESRD is in the second, and then we have the totals and blended totals in the rightmost column. Starting at the top, you work your way down to look first at the prospective benchmark and then the updated benchmark, those totals, and then we can see our aligned beneficiaries, that total number being 45,972 for Medicare, and their total person months. That gives us a blended PBPM of $848 for 2022. Under that, we see what was actually spent on behalf of those beneficiaries, broken out into the claims spend, which is the more traditional fee-for-service spend, and the AIPBP fee reductions; AIPBP stands for the all-inclusive population-based payments. We subtract the adjustments that had to be made in 2022, which were inclusive of uncompensated care, a new program change for 340B COVID expenditures, and sequestration, to get our total spend. And from there, we calculate things like the quality adjustment and the gross and net shared savings and losses, which I'm going to go through two slides from now. So next slide, please, Shell. This slide is just showing the top-line spend for all of those beneficiaries in each year. You can see that in 2022, the AIPBP to fee-for-service spend ratio is pretty similar to 2021, but overall, through the course of the model, we are seeing proportional growth in the AIPBP investment versus fee for service. So that's a good thing, I think. Next slide, please. And then this is kind of the meat and potatoes, that bottom half of the settlement spreadsheet, which again lives on our website.
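The spreadsheet arithmetic described here, a blended PBPM from benchmark dollars over person months, and total spend built up from claims plus AIPBP payments less adjustments, can be sketched as follows. This is an illustrative sketch with hypothetical figures, not the actual settlement calculation.

```python
# Illustrative sketch of the settlement-spreadsheet arithmetic described
# above. Figures in the example are hypothetical, not the 2022 values.

def blended_pbpm(benchmark_total: float, person_months: int) -> float:
    """Blended per-beneficiary-per-month rate: benchmark dollars / person months."""
    return benchmark_total / person_months

def total_spend(claims_spend: float, aipbp_payments: float,
                adjustments: float) -> float:
    """Total spend = fee-for-service claims + all-inclusive population-based
    payments (AIPBP) - adjustments (uncompensated care, 340B COVID,
    sequestration)."""
    return claims_spend + aipbp_payments - adjustments

# Hypothetical example: $10,176 of benchmark over 12 person months -> $848 PBPM.
print(blended_pbpm(10_176.0, 12))   # 848.0
```

Gross savings or losses then fall out as the benchmark less this total spend, before the quality and sequestration adjustments discussed below.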
For 2022, we have gross savings and losses of $20.3 million. There's a cap on those savings and losses, and for 2022 that cap is $9.5 million; the ACO hit the maximum, so they got all of it, $9.5 million. The quality adjustment is the big difference this year, and I don't want to steal Michelle's thunder, she's the quality person, so I'll let her talk about that. But the total adjustment numerically is subtracting that $786,302. In 2022, the ACO's risk arrangement was 100%, and so the adjusted capped savings and losses, subtracting for quality and subtracting sequestration, comes off that $9.5 million. Then we always subtract the advance shared savings, and that gives us the net settlement adjustment for 2022, which is this $490,346. And I think after that is a discussion of quality. It is. Thank you, Lindsay. As Lindsay just mentioned, there was an overall reduction based on OneCare's quality performance in the Medicare program this year. I'll get to that in a minute, but I wanted to do a quick overview of Medicare quality performance and how it's broken out. It's in four domains; you can see them on the screen here. For the 2022 performance year, it was a mix of pay for performance and pay for reporting, and we'll talk about that a little bit here. Past Medicare performance has gone through a bit of an ebb and flow, some of it due to the public health emergency. Performance year one was a 100% score, earned just based on the first year of the program being in operation. In 2019, we had another mix of pay for performance and pay for reporting. For 2019 and 2022, the areas where it's pay for reporting only are areas where Medicare does not have a comparable benchmark to compare OneCare to, so they automatically earn the full points for those measures. As you can see, performance declined in performance year five, 2022, and we'll dig into that a little deeper.
The CAHPS results are the area in which OneCare's performance declined the most significantly and had the greatest impact on their overall score. You can see that here. The biggest decreases we saw were in health promotion and education, which had a three and a half percent decrease; stewardship of patient resources, which was almost a nine percent decrease; and care coordination, which was about a four and a half percent decrease. Again, those had the largest impact on the overall quality score. A couple of things to note here. If you recall, last year we talked about a change in how the CAHPS surveys were administered. OneCare opted to no longer contract for phone calls in the CAHPS survey, so these are paper-survey based only. That was a change in the 2021 performance year. And as another reminder, CAHPS surveys were not administered in 2020 as a result of the public health emergency, so there are no CAHPS results for 2020 if you go back in time. Just wanted to flag that. I did pull the 2021 performance year so you can see the changes. I will also add that in 2021, after a pause during the public health emergency, CAHPS benchmarks were switched from purely decile-based performance, so 10 was the 10th percentile, 90 the 90th percentile, to actual performance-based benchmarks. So in some cases you'll see those percentiles; in some instances the 90th percentile might be 84.5 to 86 percent. They're much smaller ranges to meet some of those percentiles starting in 2021. Here are some of the clinical and claims-based measure results. Clinical quality continues to use decile-based benchmarks across the board. You'll see that there are noted decreases here; most of them are quite modest. We didn't see as large an impact on the quality score in this arena as compared to the CAHPS survey results. There were also significant changes to VT1 and VT2.
That's just what CMMI called them, because they're not typical Medicare measures: the follow-up after discharge from the ED for mental health or for alcohol or other drug dependence, and the initiation and engagement measure. There were significant changes to their specifications, so I did not pull over the 2021 performance. And as I noted earlier, these aren't typical Medicare measures, so there are no performance benchmarks. I really wanted to highlight some considerations across the three areas we just looked through. First, again, performance on CAHPS had the largest impact on the score. I talked about the measures with the largest decrease, and I do have the CAHPS survey handy, so if any board members want to quiz me on the measures included in those, I happen to have them. In clinical quality, all of those measures were pay for performance in 2022, and performance on those did decrease across the board, but the changes were modest. There was an addition of several codes that resulted in some denominator changes, which means they're not really comparable to prior years anymore. So again, on the previous slide, I did not pull forward 2021 performance, to be really careful about only making comparisons where that makes sense to do. And the claims-based measures were pay for reporting, because there are no performance benchmarks. So I tried to call out all of the really key points for Medicare performance on those slides. I am going to pause to ask Chair Foster how you want to proceed. We can go through all payers and then have Q&A at the end, or we can do it individually after each payer speaks. Do you have a preference? I don't, unless you do. We typically go right through, but it's up to you. That's totally fine with me. Thank you. Okay. Yes. Then with that, I'm going to be joined by Amy and, I believe, Alicia from DVHA. Amy and Alicia, just let me know when you want me to advance slides for you. Hi.
Thank you, Michelle. For the record, I'm Amy Coonrod, the Director of Operations for Accountable Care Organization Programs at the Department of Vermont Health Access, and I'm joined by Alicia Cooper, the Director of Operations for Managed Care Programs at the Department of Vermont Health Access as well. We are here to talk about the VMNG program's 2022 performance. Next slide, Michelle; thank you for driving, by the way. Just for a little bit of framing, and I think this slide looks familiar to a lot of folks, the VMNG program is reinforced by DVHA's priorities. DVHA has three priority areas that the department has been focusing on for, I think, the last seven years at this point. We're nothing if not consistent. Two of them pertain to the VMNG program. One is related to value-based payments, and this is a pretty great example of a value-based payment model. And one is related to performance. By implementing the VMNG program, we are able to focus on Medicaid being a predictable and reliable payer partner, and we're also able to focus on continual and incremental programmatic improvements as we make changes year over year to the VMNG program. The program also gives DVHA opportunities to align Medicaid with other similar payer programs in the state, and it allows DVHA to be an innovative leader and test new ideas that other payer programs may want to align with in the future. One of those, at this juncture, is exploring new payment models related to the recently announced AHEAD model, but I will get to that in future slides. Next slide, please. As a refresher on where we are and how we got here, the original VMNG contract was signed back in 2017. It was a one-year agreement with four optional one-year extensions, and DVHA and OneCare Vermont triggered one-year extensions for each of those years: 2018, 2019, 2020, and 2021.
Then in 2021, DVHA issued an RFP to continue contracting for ACO services for the 2022 performance year, and OneCare was the successful bidder in that RFP process. After that, DVHA and OneCare negotiated a subsequent one-year contract with the possibility of three one-year extensions, with a start date of January 1st, 2022. DVHA and OneCare are currently negotiating the second of those one-year extensions, for the 2024 performance year. The PMPM rates associated with the program are renegotiated annually, typically due to changes in the attributed cohort. Reconciliation for the program currently occurs annually, but it could occur more frequently during the performance year should we identify a need to do that. Next slide. And yes, now we will dive into the 2022-specific performance for the program. This slide shows the scope of the program year over year, which continues to be stable in terms of size and scope, as shown in this table. As we can see, the program grew in its first four years, 2017 through 2020, and leveled off in the 2021, 2022, and 2023 performance years, both in terms of the number of HSAs (health service areas) participating, the number of Medicaid members attributed, and the number of unique Medicaid providers participating in the program. This, combined with the use of an expanded attribution methodology beginning in 2020 and continuing until today, indicates the program has potentially reached scale for Medicaid and may not see much more significant growth in future performance years. As a reminder about that expanded attribution methodology: we implemented it in 2020 such that we attribute Medicaid members to the program both if they have a demonstrated relationship with a primary care provider, which we call our traditional attribution cohort,
and if they don't necessarily have a primary care provider but they have a full Medicaid benefits package, which is our expanded attribution cohort. The program still excludes members whose primary care provider is not participating in OneCare. I'll note that late-breaking attribution was recently set for the 2024 performance year, and the program is seeing fewer members attributed through Medicaid, as we can see in that 18% decrease there. That likely has to do with the redetermination activity that has picked up at DVHA since the end of the public health emergency, which has resulted in the disenrollment of a number of Medicaid members who no longer meet eligibility criteria. Because OneCare's provider network has remained stable between 2023 and 2024, we think that decrease is likely due to redetermination activity. Next slide, please. This is a fun, colorful one. Just to quickly review the details of the payment model in the VMNG program: one primary characteristic of the payment arrangement we have with OneCare is that we negotiate an agreed-upon price for the attributed members for each VMNG contract year. This is illustrated by the green bar to the left, which is 100% of the total cost of care, or the agreed-upon price. Additionally, the arrangement includes a risk corridor, illustrated by the red and green dotted lines, whereby if OneCare spends between 100% and 102% of the agreed-upon price, the space between the blue and red dotted lines, they're liable to pay that money back to DVHA. But if they spend above 102%, they're only liable up to that first 2%.
Conversely, if OneCare spends less than the agreed-upon price, the area between the green and blue dotted lines, they and their provider network are entitled to retain the difference between actual performance and the full 100% of the agreed-upon price for the first 2% of underspend, which creates an incentive to be efficient with resources within the risk corridor, as outlined here. Next slide, please. In terms of financial performance for the 2022 performance year, DVHA and OneCare agreed on the price of healthcare for attributed Medicaid members up front, and spending for ACO-attributed members was approximately $12.1 million less than expected against an expected total cost of care of $285 million for the traditional attribution cohort, and around $3.7 million less than expected against an expected total cost of care of approximately $45 million for the expanded attribution cohort. Because the expanded cohort is still relatively new to OneCare, the traditional and expanded attribution cohorts had distinct risk arrangements and were reconciled separately in 2022. I will note that those have since been combined into one cohort for the purposes of reconciliation, with one risk corridor for 2023 and 2024, but we're not there yet; I'll talk about that next year. After applying some necessary adjustments, DVHA will issue OneCare a reconciliation payment of approximately $11.8 million, and that includes the money within the risk corridor. I don't have more detailed numbers on this slide about the whole financial reconciliation and how it shook out, but we would be happy to share that detail as well; it just gets very busy on a slide, so I didn't paste it here. Next slide. Thank you. Here is just another way of depicting graphically the ACO's financial performance for both attribution cohorts for the 2022 performance year. We do love our stacked bar charts in this model.
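The corridor mechanics just described can be sketched minimally as follows, assuming gains and losses are both capped at 2% of the agreed-upon price. This is an illustration only; the actual reconciliation layers on additional adjustments that are not modeled here.

```python
# Sketch of the VMNG ±2% risk corridor as described above: the ACO keeps
# underspend up to 2% of the agreed-upon price, and owes overspend up to
# the same 2% band. Not the actual DVHA reconciliation logic, which also
# applies further adjustments.

def corridor_settlement(agreed_price: float, actual_spend: float,
                        corridor: float = 0.02) -> float:
    """Positive result: payment owed to the ACO. Negative: ACO owes Medicaid."""
    band = corridor * agreed_price             # the 2% corridor in dollars
    difference = agreed_price - actual_spend   # underspend (+) / overspend (-)
    return max(-band, min(band, difference))

# With a $100M agreed price, a $3M underspend is capped at the $2M band:
print(corridor_settlement(100_000_000, 97_000_000))   # 2000000.0
```

The clamp to the band is what limits both DVHA's exposure above 102% and the ACO's retained savings below 98%.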
The expected total cost of care is the total of all of the components of the bar graph, with the yellow portion being the prospective payment that is issued to OneCare monthly for its attributed membership from DVHA, and the orange portion being the fee-for-service portion that DVHA retains and pays to providers on OneCare's behalf, for those providers who may not be ready to be paid prospectively, or providers outside of OneCare's network, such as out-of-state institutions that have a relationship with Vermont Medicaid. The gray part of the bar is the difference between what was spent and the agreed-upon price, or the expected total cost of care, and in this performance year it is owed from DVHA to OneCare. On the left we see the traditional attribution cohort, and on the right the expanded attribution cohort. Next slide. Here is that same graphic depiction of the expected total cost of care, broken down by the prospective payment portion and the fee-for-service portion, and how actual performance shook out against the risk corridors for all years of the program to date. It is getting busy on that slide, but I think it's a great depiction of how the expected total cost of care broke down between the fixed prospective payment portion and the fee-for-service portion across the years. This is pretty similar to the slide Lindsay spoke to in the Medicare portion of the presentation: we generally see that the fixed prospective payment component of the expected total cost of care is a little over half for each of the performance years here, and I think that's good to note. The slide also shows that there have been years where OneCare was entitled to a payment from DVHA because they spent less than the agreed-upon price, specifically in 2017, 2020, 2021, and 2022.
And there are years where OneCare was liable to pay DVHA back for an amount in excess of the agreed-upon price, which I believe were 2018 and 2019, yes. Next slide. Now we will run through the VMNG program's quality performance for the 2022 performance year. As a reminder, the measure set for 2022 contained 10 payment measures and three reporting measures, including the CAHPS survey. Also as a reminder, the VMNG program reverted back to pay for performance in 2022 after being pay for reporting due to COVID in previous performance years. OneCare's providers earned a total of 13 out of 20 possible points in 2022 for quality performance, yielding a quality score of 65%. That quality performance exceeded the national 90th percentile for three of the measures in the quality measure set, exceeded the 75th percentile for two measures, exceeded the 50th percentile for three measures, exceeded the 25th percentile for one measure, and was below the 25th percentile for one measure. Based on quality performance, the year-end quality adjustment to the expected total cost of care for 2022 was just shy of a million dollars. Next slide, please. This slide just gets harder and harder to read every year, and I apologize for that. This is the actual quality scorecard for the 2022 VMNG quality performance. The measure set has not changed very much since 2017 and 2018; it's been pretty consistent, so I won't go through and read all of the measures here, but you can see how the traditional cohort did on these measures, how the expanded cohort did, the benchmarks against which they were measured, and their 2021 performance. You can also see, in a color-coded way, the percentiles that they fell into when they were scored against these measures.
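To illustrate how percentile-based scoring of this kind can roll up into a single quality score, here is a hypothetical sketch. The point values per percentile tier are assumptions for illustration and are not stated in the presentation; only the overall roll-up, 13 of 20 possible points equaling a 65% score, is from the reported results.

```python
# Hypothetical sketch of percentile-tier quality scoring. The per-tier
# point values below are assumed for illustration; the presentation states
# only that the payment measures yielded 13 of 20 possible points (65%).

TIER_POINTS = {90: 2.0, 75: 1.5, 50: 1.0, 25: 0.5}  # assumed point schedule

def measure_points(percentile_met: int) -> float:
    """Points for the highest national benchmark percentile a measure met."""
    for threshold, points in sorted(TIER_POINTS.items(), reverse=True):
        if percentile_met >= threshold:
            return points
    return 0.0

def quality_score(points_earned: float, points_possible: float = 20.0) -> float:
    """Overall score is points earned over points possible."""
    return points_earned / points_possible

print(quality_score(13.0))   # 0.65 -- the 65% score reported for 2022
```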
I'll note just really quickly that the traditional cohort was paid for performance on all measures, and the expanded cohort was reporting only on the claims-based measures. We don't have rates for the expanded attribution cohort on the clinical measures; due to the way those members are attributed, it would be difficult to do chart abstraction for samples on those clinical measures for the expanded attribution cohort. I'll generally say that there were some noticeable improvements, for the 30-day follow-up after discharge from the ED for alcohol and other drug abuse or dependence treatment, which was significant, as well as a significant improvement in diabetes poor control. There was also a statistically significant decrease in the engagement component of initiation and engagement of alcohol and other drug abuse or dependence treatment. That is a mouthful. I will let folks squint at that at their leisure, but I will move us to the next slide. In terms of future opportunities for the VMNG program, looking ahead, DVHA remains committed to testing this model in future program years, and we're currently negotiating with OneCare on an amendment for the 2024 contract year. DVHA is interested in continuing to use this model to innovate, especially as we look toward the next multi-state model recently offered by CMMI, the AHEAD model, and Vermont's potential participation in it. The new AHEAD model features a hospital global budget component and would require participating states' Medicaid agencies to offer hospital global budgets that align conceptually with the Medicare hospital global budgets within the AHEAD model.
As such, DVHA and OneCare are currently working to develop and implement for 2024 a global payment program covering a significant portion of hospitals and independent primary care providers, basically the providers who are currently being paid through a fixed prospective payment for Medicaid. It would convert the remaining fee-for-service Medicaid revenue of those providers into fixed payments in a no-risk model that would reconcile back to fee for service; that is, this portion of the fixed payment would reconcile back to fee for service. It would allow these providers to test something global-budget-like for one payer, Medicaid, before any requirements for participating in the AHEAD model with multiple payers in future years, should we decide to participate in the AHEAD model. Implementing the global payment program in 2024 would also give Medicaid valuable experience with operational considerations, as well as ascertaining the appropriate Medicaid authorities for implementing a more comprehensive global budget model, well in advance of a potential first year of the AHEAD model, again, should Vermont choose to participate. That first year could potentially be calendar year 2026. And that is all that I have at this juncture. It sounds like we're holding questions until the end, so I will stop there, and thank you very much for the opportunity to speak about this. Thanks. If we're ready to move on, I'll introduce myself briefly. I'm Andrew Garland, the Vice President of Client Relations and External Affairs for Blue Cross Blue Shield of Vermont. My day job is generally in the sales and marketing space, but I still work on our ACO program and all of our value-based care programming. So I'm going to talk a little bit about our relationship with OneCare and our 2022 results, and then my peer Grace Gilbert Davis will join us to talk about quality, as she is much more knowledgeable about that than I am.
So let's jump to the first slide. This is our healthcare reform philosophy. Those of you who have heard me do this presentation over the last five, six years have seen this, I think, in every deck. I draw your attention to those three bullets in the middle of the philosophy. Of course, you're encouraged to read it all when you have a moment, but improved clinical outcomes, reducing the cost of care for our members and maintaining an exemplary member experience. Those are the criteria that we put on the table for virtually everything that we do. And that's how we think about every healthcare reform proposal that comes across our desks. So you'll hear the rest of this presentation in light of those standards. I think we can jump to the next slide where we summarize, as we do every year, just what's going really well in our relationship with OneCare and the program and where we have challenges. You're aware that we have put our contract on hold for 2023, but we still had a productive year working with OneCare in 2022. We developed a post-COVID risk model that does promise a significant improvement in the shift of reimbursement from fee for service to incentive-based or value-based payments. And we're pretty excited about that model. I'll talk about the financial results briefly in a minute, but the model really is designed to correct some of the challenges that we have in the current risk model or the risk model that was in place in 2022. We also worked on a longer term approach to quality with OneCare in 2022. I'll talk in a moment about the quality at a very high level, but many of you have heard me say in the past that one of our challenges, and I think I see this in the data that we just saw from our other partners, that our quality results haven't tracked in any consistent way. 
One of our hopes is that by establishing a more long-term approach with sustained focus, we might start to establish some trends in the quality space where we see meaningful improvement that we can attribute to our work, and expect to maintain that improvement as we move into the future. The challenges, really, are the same ones I've cited each year. A lot of good things are happening in our work with OneCare, but we're still fundamentally unable to link anything that OneCare has worked on with financial or quality progress. In some cases the financials look better; in some cases they look worse. Same with the quality. What we aren't able to do is strongly correlate any of those changes with the work that we've done with OneCare under this program. Then of course, the new challenge; I think it's important to cite that briefly here. The reason we're on hold with our formal risk program with OneCare is that we have some data protection issues that we're still working through as they move systems to a new vendor, or new host. I think we can jump ahead now; I believe there's a placeholder slide, and we can go on to the financial results. I'll just remind everybody that we established, right at the beginning of the pandemic, a new risk model, a truncated risk model, for this arrangement. Knowing that the commercial market was very skeptical of a broad-based payment program like this and difficult to convince to participate, we felt it was critically important that the ups and downs of the pandemic not be transmitted back to the market through a value-based care program that produced results that were simply nonsensical, driven by COVID lulls and COVID rushes and not really representing real progress. So for example, we suspected, and so did our ACO partners, that in 2020 we would see a huge falloff in utilization as hospitals and practices had to really limit the flow of patients to maintain safety.
We both knew that we could not go back to the market and say that we were making a big payout to OneCare in the form of shared savings for that volume reduction, which was obviously not related to our work in the value-based care space. And frankly, we didn't have then, and we still don't have now, the methodological expertise, I think, to adequately separate out what is COVID-perturbed utilization in that experience and what's actually attributable to the work that we do with OneCare. And I would just add that COVID impact continued into 2022 for sure. And frankly, I think we're continuing to see it in 2023. We've seen some historically aberrant utilization this year. Our final settlement for the QHP population, the exchange population, did not result in any financial transfer between the two parties. Even if it had, I think we had limited the risk transfer to $50,000 or $75,000 for that line of business. Again, we wanted to stay in the program. We wanted to have a risk deal that we could report to CMS and CMMI in those years, but we didn't want any strange payouts. So we brought those corridors way in. There is a small payment that will come to us for the primary cohort from OneCare. That's the large group and ASO cohort. But again, from our perspective, not really meaningful. The QHP results are particularly difficult because of a methodology we had included in the old model for trying to understand how coding improvement impacts what we see in the risk picture as we evaluate the pools each year. But the sample size of our comparison population was just so small that the results were really suspect. I had mentioned before that we worked on a new methodology with OneCare in 2022, and that's precisely one of the problems that we knew we needed to solve for the sustainability of the model going forward. So when we get back to it, we do have a new approach to that.
In our backwards testing, we applied the new methodology to earlier experience in the model (I think we looked at 2019 and 2020), and it shows that it really does a much better job of modeling what happens. So we are excited about that. Finally, that last bullet: generally speaking, it's difficult to correlate cost with our traditional quality measures, the CAHPS measures, the things that we measure for all of the accrediting agencies that we work with; the correlations between those measures and costs are just not strong. But I think it's worth saying that in this program, we couldn't establish any real correlations between positive movements or negative movements on the quality side and the financial results. And with that, we can jump to the next slide. This slide I won't attempt to talk to. I have presented it in the past. I'll just say that as Grace and her team have walked me through the quality measures this year and in past years, I've been able to understand that there's just a degree of volatility here that makes it very difficult to say, yeah, that measure is really being impacted by this program, or it's not, for better or for worse. So with that, I'm going to turn it over to Grace. She'll take you through the specific measures for the qualified health plan and show you the large group measures. And I think it's the first time we'll be showing you that data. We finally had the numbers we needed in the program in 2022, and some really good work by our data folks to set that out. Thank you, Andrew. Hi, folks. My name is Grace Gilbert Davis. I'm the corporate director for value-based network development and quality improvement. And I'm pleased to be here today. So a couple of housekeeping items about the quality slides we'll review today, including this one reminder that, as Andrew mentioned, due to COVID, for the 2020 through 2022 agreements, the payment measures were reporting only.
And in terms of our review today, the quality results with numerators and denominators that are less than four are not included, because they sort of skew the findings. Those are the 30-day follow-up after discharge from the ED for alcohol and other drug dependence, where we had a denominator and numerator of just one or two, and the 30-day follow-up after discharge from the ED for mental health, with a denominator of four and a numerator of four. As in prior years, you know, we see performance ups and downs when comparing 2021 and 2022. Notable improvements include alcohol and other drug initiation and treatment, and depression screening and follow-up. Now the latter is a standing reporting-only metric, but it's an important metric, we believe. Let's go into the next slide to understand the ACO's quality results over time. So this is a sort of complicated graph to read. But the previous slide that we just looked at only compared 2021 and 2022 quality results. This graph shows the ACO's impact on Blue Cross members between 2018 and 2022. And a reminder, you know, the graph does not include anything with a denominator or numerator less than four. And we also have a separate slide for all-cause admissions. So considerable resources were used during this time frame to impart, you know, meaningful change. But the trend for Blue Cross members is largely static for these metrics, with a few exceptions. You know, in 2019, A1C poor control stood at an impressive 12%, but by 2022 it had increased to 20%. This is an inverse measure, so less is best, right? Two other metrics realized a 10% decrease over the five years, and here, you know, more is best: follow-up after hospitalization for mental illness, the seven-day rate, and developmental screening in the first three years of life, which again is a standing reporting-only measure, but an important measure. Next slide, please.
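As a sketch, the small-cell exclusion just described, dropping quality results whose numerator or denominator falls below four, might look like the following in Python. The measure names, counts, and function name here are hypothetical, used only to illustrate the rule as stated.

```python
# Small-cell suppression sketch: exclude quality results whose numerator
# or denominator falls below a minimum cell size, since tiny samples
# skew the findings. The threshold of 4 follows the rule as described;
# measure names and counts are hypothetical.
MIN_CELL = 4

def suppress_small_cells(results, min_cell=MIN_CELL):
    """Keep only measures whose numerator and denominator meet the threshold."""
    return {
        measure: (numerator, denominator)
        for measure, (numerator, denominator) in results.items()
        if numerator >= min_cell and denominator >= min_cell
    }

sample = {
    "30-day follow-up after ED visit": (1, 2),   # suppressed: cells too small
    "developmental screening": (180, 240),       # kept
}
kept = suppress_small_cells(sample)
```

The same filter would then be applied before plotting the trend graphs, so that every series shown meets the minimum cell size.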
So here, rather than comparing 2021 and 2022 and looking over the five-year trend, we looked at what was happening in 2018, when we first began working with the ACO, and how that compared to the last year we worked with the ACO, which was 2022. So when we compare, you know, these quality results, we find that nearly half of the metrics did not improve. Those are the solid blue columns. Okay. I should point out that there is an error in here: the diabetes column, A1C poor control greater than nine, is actually an improvement, since less is best. And that is my error, and I apologize for that. Next slide, please. So as Andrew mentioned, for the last three years we've been collecting data for large group; in the past, you've only ever seen the results for QHP. So we're pleased to be able to have this comparison data as well. When compared with QHP, we see more metrics with performance decreases than improvements. Notables include, you know, follow-up after hospitalization for mental illness, the seven-day rate, which fell from roughly 72% in 2021 to 47% in 2022. We did see a 17% improvement, however, for the 30-day follow-up after emergency department visit for alcohol and other drug abuse or dependence. So again, you see volatility, you see that the results are largely static, but here and there, you know, there are some outliers, both positive and negative. Next slide, please. Again, a busy, busy graph, but it shows, as we've trended over time, the large group largely mirrors QHP: results are largely static, with few exceptions. Both of these graphs, I call them the spider graphs, just because they look like spiders. Next slide, please. Here again, we looked at what the quality results were for large group in 2020, when we first began collecting data, and then in 2022. It's a similar comparison: the blue columns indicate quality metrics that improved in 2022 when compared to 2020.
And here we see that six out of the nine metrics show improvements in 2022; those would be the textured blue columns. Next slide, please. And then all-cause admissions for QHP and large group: the positive downward trend beginning in 2019 and continuing through the COVID years, 2020 and 2021-ish, reversed and increased in 2022 for both QHP and large group. You know, I would suggest that this is most likely a result of COVID, and one would expect to see the downward trend resume for other commercial payers and for Blue Cross in 2023. Next slide, please. So we wanted to share a little bit about our other quality and total cost of care, or value-based care, work with you. You know, we continued to honor the all-payer model in 2023 throughout this year by taking the $3.25 PMPM that we were paying to OneCare, who was then passing it through to the primary care providers in their network, and we continue to make those payments directly to the PCPs in 2023. In 2024, our plan is to reallocate those resources formerly used for the ACO. And that includes not just the dollars, but the large number of staff who were very focused on the ACO relationship and agreement, using those individuals and these dollars to focus on a growing portfolio of value-based care programs. And there are two that are notable. One is Vermont Blue Integrated Care, or VBIC. I'm sure we've talked about this with the Green Mountain Care Board before and with other groups, but this is an advanced primary care model that was created with the help of four primary care practices, most notably Evergreen Family Practice. We're in year one of a two-year pilot, so 2024 will be our second year. Quality metrics include disease management and utilization metrics for reducing total cost of care. And then for the program elements, we really are trying to look at what our members need, what Vermonters need. And so we are focusing on mental health and substance use disorder services.
We are definitely working with the VBIC practices as well as other entities to address collaborative care coordination, so that we reduce duplication of effort in case management. And then we're pretty proud of the advancements we've made in our data sharing and reporting: practice-specific reporting, not HSA-level reporting, but practice-specific reporting for quality and total cost of care metrics. The second program is really new for 2024, and there'll be more information coming out about it in our response to the Green Mountain Care Board. But the bottom line is the association has created a very large database, where they will be able to generate scorecards using population- and condition-based metrics, as well as total cost of care measures. They're risk-adjusted for the health status of the provider's patient panel. So each provider will have a scorecard, and we will be reimbursing them the value-based PMPM based on their scorecard. Now both of these programs have some similarities. They are ACO agnostic, meaning if a VBIC practice is part of the ACO, that's fine, that works. It works with an advanced primary care model like VBIC, and with a program like the Enhanced Community Primary Care, or ECPC as we call it. Both of these programs are dedicated to independent community providers and the providers who work in FQHCs. And then, as I noted, additional information will be forthcoming about both of these programs, and we have an alignment assessment due to you folks at the Green Mountain Care Board on December 1st. And we can go to the next slide. I forget if we're taking questions and answers now or we're going to wait till the end. There is an appendix with the scorecards, so please feel free to review this material. And yeah, we're open for questions and answers at any point. Thank you. Hello, I'm Karla Renders. I'm the Director of Network Management for Northern New York and Vermont here at MVP. I'm accompanied today by three of my colleagues.
Scott Mamro, our Vice President of Network Strategy; Matthew McKinnon, Vice President of Network Management and Contracting; and Jordan Esti, our Senior Director of Government Affairs. So we can go to the first slide, please. So I always like to kick off presentations with MVP's mission statement and our guiding principles of improving health and providing peace of mind to our members. We do this through our core values, which are being the difference for our customers by making them feel reassured their healthcare needs will be met. We're curious as to their wants and needs and work to anticipate and address those needs for a better consumer experience. And finally, we aspire to be humble, as humility allows us to keep an open mind and be receptive to innovative ideas from all of our constituents, be they employees, employer groups, members, providers, or partners such as OneCare. Next slide. Well, actually, you can go to the fourth slide, Michelle. Thank you. So we'll get right into the components of our 2022 arrangement. This is the third year of the MVP-OneCare arrangement, and 2022 was very similar to 2021 in terms of our contractual components. The program continues to cover commercial lives under qualified health plans, so those are individual and small group membership sold on the Vermont Exchange. 2022 continued the upside-only total cost of care shared savings arrangement, with the amount of savings being subject to a quality gate. The quality metrics that we utilized for the quality gate were metrics selected from the all-payer model and were the same as what was utilized in 2021. As part of the arrangement, MVP provides OneCare with eligibility, claims, and financial analytics for the attributed population on a monthly basis. And finally, MVP did continue in 2022 to provide a monthly primary care investment payment, which is then distributed to the downstream providers in support of the OneCare population health model. Okay, next slide.
This slide is looking at the results of the 2022 performance year. Although the financial results were not optimal for 2022, the final settlement deficit of 2.9% was far less than what we saw in 2021, when we had a 24% budget overage due to the ongoing COVID testing and treatment services, and of course the onslaught of general utilization after the long period of reduced utilization, which we saw in 2020. So what the bar graph is showing here is how OneCare performed, in orange, compared to the budget, in blue, by quarter and on a per member per month basis. There were around 8,900 members in the performance period, which actually was a reduction from 2021. The percentages that you see shown represent the percentage deficit for each quarter. So as you can see, we started the year with a budget overage of about 18%, and then by quarter four, it had come down quite a bit to a little bit under 3%. Because savings were not achieved, there was no distribution of dollars between MVP and OneCare. Okay, next slide. So why did we end up over budget in 2022? The contributors were as follows. First, while the market population risk score increased by 5% in 2022, the OneCare population risk score increased by 20%, meaning there were sicker patients in the measurement period as compared to the market population. Additionally, and this does correspond to the risk score increase, we also saw increases in ancillary facility services, such as lab, cancer therapies, and imaging, and finally in outpatient surgeries. These were not, of course, the only contributors to the overage, but they were the most substantial. We can move on to quality. On slide eight, the 2022 program was again similar to 2021. The quality metrics were selected by OneCare using the standard all-payer model metrics. CMS 2021 benchmarks were utilized. We have a point system which determines the amount of shared savings due to OneCare.
Since in 2022 there were not any shared savings, that is not as relevant. It should be noted that three of the measures had such low member denominators that the points for those measures had to be redistributed. We only look at metrics that have 30 or more members in the denominator; we determine that to be statistically significant. So moving to the next slide, this is a depiction of the quality scorecard that we distribute to OneCare at the end of the settlement period. Again, the scorecard does play a role in the event that there are shared savings, as more or less of the savings is shared based on quality performance. Again, this was not applicable in 2022. So describing what you see here: the quality scorecard is worth 100 points, with each metric being worth around 12 points. Like I said, for three of the metrics, the points were redistributed because of the low denominator. Those were follow-up for alcohol and substance abuse after an ED visit, seven-day follow-up for mental health after hospitalization, and 30-day follow-up for mental health after an ED visit. Points are awarded based on the scale that you see at the bottom of the slide: zero points if the metric falls below the 50th percentile, 50% of the points are earned at the 50th percentile, 75% of the points at the 75th percentile, and all of the points if the 90th percentile is reached. Again, this was not optimal performance for quality either. While controlling high blood pressure remained static compared to last year at the 50th percentile, and HbA1c control also remained at the 90th percentile, there was a drop in wellness visits from the 90th to the 75th percentile, and for all-cause readmissions OneCare dropped from the 90th to under the 50th percentile, hence resulting in a low overall performance score.
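As a sketch, the percentile-to-points scale just described could be written as follows. The exact point shares (0% below the 50th percentile, 50% at the 50th, 75% at the 75th, 100% at the 90th) and the 20-point metric weight are assumptions drawn from the description above; the function name and the example percentiles are hypothetical.

```python
def metric_points(percentile_reached, max_points=20.0):
    """Convert a metric's achieved benchmark percentile into scorecard points.

    Scale as described in the presentation (exact shares are an assumption):
    no points below the 50th percentile, half the points at the 50th,
    75% of the points at the 75th, and all of the points at the 90th.
    """
    if percentile_reached >= 90:
        return max_points
    if percentile_reached >= 75:
        return 0.75 * max_points
    if percentile_reached >= 50:
        return 0.50 * max_points
    return 0.0

# Hypothetical scorecard of five equally weighted metrics after the
# low-denominator points were redistributed (20 points each, 100 total).
achieved = [90, 90, 75, 50, 40]
total_score = sum(metric_points(p) for p in achieved)
```

Under that assumed scale, a single metric slipping from the 90th percentile to below the 50th, as described for all-cause readmissions, costs the full weight of that metric, which is why the overall score drops so sharply.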
Both MVP and OneCare have recognized that these are not the outcomes that we anticipated or that we feel are acceptable, but as we move into the next slide, we will review what we're doing differently this year and in 2024 to remediate that. So as I just said, in 2023 we entered into an agreement to address opportunities for improvement, and we continue to do so in our contract negotiations for 2024. In 2023 we did move to a true risk arrangement, so the parties will share in losses as well as savings, meaning we both have more skin in the game. We introduced a new metric to the scorecard that needed improvement in the MVP population, and that was colorectal cancer screenings, as well as kept the existing metrics OneCare selected that align across their arrangements. We of course continue to be committed to the ongoing conversations around global budgets to improve healthcare costs for Vermonters. And finally, MVP has a new department within our organization called provider engagement, which is focused on arrangements such as these. To assure success, they create an account plan, which is really a narrative on how the parties will collaborate to achieve the goals set in the contract. And the provider engagement account manager and myself, along with other stakeholders at OneCare, have begun to meet monthly to review how the arrangement's going, you know, and answer any questions on attribution, financial statements, or other issues as needed. And then moving on to our 2024 arrangement, which is still in negotiation: we are continuing the downside risk arrangement. But aside from the total cost of care component, we are building in opportunities for improvement on wellness visits and mental health screenings, as well as a separate quality program that is laser focused on just two metrics for performance improvement, as well as continuing the metrics that you just saw in the quality scorecard.
We're also involved in valuable discussions to standardize the way we review and track social determinants of health that impact the overall health and well-being of our Vermont members. And of course, that's more focused on government programs and not commercial products, but it's another point of collaboration with OneCare and other constituents in Vermont. And finally, we are going to see more MVP membership covered under the arrangement in 2024. MVP did update its attribution platform this year, and we are showing prospectively almost double the Vermont membership attributed to this program. That's because we're looking at things like more recency of member activity and the full scope of services, not just preventive visits and E&M. So that means more of the population will hopefully be benefiting from the arrangement in 2024. So that concludes MVP's presentation. Thank you. Good afternoon. Thank you to all the payers for what you have shared. I'm Carrie Wolfman. I'm the chief medical officer at OneCare Vermont. And I am joined today by two staff members on the quality and payment reform team at OneCare: Derek Reigns, the director of payment reform, is here, and Jody Fry, who is our assistant director of population health model integration. Next slide, please. A lot of what is on this slide, again very busy as others have commented, has already been shared, but we thought it would be a good snapshot across all of you payers from 2022 to compare progress. Where you see an NA, there is no applicable benchmark available. And where you see a green box, it means that payer is not tracking that metric. Some of the things I'll point out may already have been mentioned. One is that the best score across all payers is in diabetes control. And I think you probably already know and heard that; that is a little more than halfway down, and it shows across all payers.
We are in Vermont at the 90th percentile or greater for controlling diabetes, the A1C greater than nine metric. Another thing I want to point out on this slide is a metric that we have not been budging across all payers, which concerns me and I think should be a focus area, and that is controlling high blood pressure. That is right under the diabetes metric. So as you can see, in 2021 and 2022 across the payers, we're only at the 70th percentile in Medicare and the 50th across the other payers. Where we fall down the most is in some of the metrics related to alcohol and other drug abuse or dependence and engagement in treatment for the same. So less than the 25th percentile, and at the 25th percentile, as Amy already pointed out, for the Medicaid population that is attributed to OneCare. I'll just pause a moment and let you look at this. I'm not going to go through each and every one of these, and many of them have already been mentioned. I think the all-cause readmissions line is maybe one more we should point to, declining from 2021 to 2022 in Blue Cross, both for QHP and large group, as well as in MVP. So let's move ahead. We may want to come back to this if we have related questions. This slide shows our population health evolution from 2023 to 2024. So what levers does OneCare have to really create change related to these quality metrics? We have data and analytics that we share with our network, and we provide financial incentives to work on certain areas. So this slide is showing the areas that we are incentivizing both this year and next year. And there is a little bit of change, which I will go through in a moment, but I want you to know that these are claims-based measures, except for the controlling hypertension metric in 2024; the rest are claims-based. These are standardized, not custom. So that's a change we are making intentionally. National benchmarks are being set as our targets.
And we continue to emphasize focus areas that include wellness and prevention, the management of chronic disease, ED utilization, and mental health treatment. And we are also sunsetting inverse measures. We've had a lot of feedback that when lower is better, that is not intuitive. People would like their metrics to not be inverse, so that higher is better. From 2023 to 2024, we are retiring the very first metric listed, diabetes. We're retiring that because of the success that I just showed you on the last slide. For the hypertension metric, we are moving from a custom measure this year back to the controlling high blood pressure HEDIS measure for 2024. And then on the third line down, I want to point out the X's across all the boxes, all the continuum of care partners. In 2024, we are asking them to work on follow-up after emergency department visits for patients with multiple chronic conditions. And this also is a proxy for helping control the cost of care, because we know that having follow-up like this will reduce readmissions and visits back to the ED as well. So you can also see that some of the other metrics the payers have mentioned today are built into our incentive program for 2024: developmental screening, wellness visits, et cetera. Another point of emphasis is that we are adding incentives for people to work on the lowest performing metrics, the initiation of substance use disorder treatment and engagement in that treatment. Next slide, please. In selecting the areas of focus for the population health model in 2024, consideration was given to the things that are listed here. So what data do we have available? Because we want all of our work and the advice that we give to be data driven. The metrics are included in our payer contracts, as you've just heard. And aligning across the payers, I think, is very important as we move forward.
We have not always been aligned, but wouldn't it be great if we could all work across the payers on the same metrics, so that providers can focus on the areas for their patients where there is the most need. We are also using our performance levels, so how we have been doing in the recent past and currently, to inform the population health model incentive areas for 2024. We want the ability to influence results built into this model. We want there to not be too much provider burden. So again, alignment: as a primary care physician, when I'm asked to work on a multitude of different measures dependent on XYZ payer, it's confusing, and often there is a lack of engagement in focusing on any of them. So the more we can align, the better. We like standard measures because we like to be able to explain the definition and have the specifications and not have to think hard about what the measures mean. The measures need to be applicable across populations, again across payers and also our continuum of care partners. The measures need to be meaningful to all of those in the state of Vermont who are providing care to our patients. And really importantly, we have received feedback from our providers and used that feedback to inform our incentivized areas going forward. The percentile targets in the population health model are chosen based on national benchmarks, and they are set relative to current performance levels, so pushing us to improve over time. The selections have also been corroborated by our benchmarking and our evaluation report outcomes. And for you members of the Green Mountain Care Board, you've already heard me say this: we have met with some similar ACOs who are performing better than we are in certain areas, like patient experience, ED utilization, and other primary care areas, and we are learning from them about some of the ways that we can support our network in improving in these areas.
While we're here, I'll just point out something that I think you all know: we don't provide care. Our ACO does not provide care. We are on 20-plus different EHRs, which complicates all of this work. And I also want it to be emphasized that providers don't distinguish between ACO-attributed patients and non-ACO-attributed patients. So if I have a patient whose blood pressure is not controlled, I don't usually know if they're an ACO patient or not, and I treat them the same way. And so when we're looking at these quality slides, I think it would be very smart if we could compare quality work on the attributed lives, as we have today, to what's going on with the unattributed population. And as somebody pointed out, the risk stratification of each of those groups also is really important to know. So are the things that have been shown in the slides before really impacted by the ACO? Or if we looked at the unattributed lives, would the numbers be the same? The last slide I have is just an example of what we show our network when we meet with them to try to incentivize them to work on these areas. So we're meeting with them on a regular basis, down to the practice level. We provide this information on a regular quarterly basis. And at lots of other times, when we're not meeting, they have access to us to ask us questions about their work and what they might need to do to improve in these areas that we're incentivizing. We also think it's very important to show them what money is on the table, what they may have earned already, and what they could still earn if they would improve in the metrics that we are incentivizing. So those were the slides and comments that I have for today. I had one more that I added in because of the discussion we've had today, and that is on the CAHPS measures. Patient experience is a very difficult thing to measure. And as a provider, I can let you know providers don't like CAHPS assessments at all.
They don't think they are a reliable picture of what's happening, at least at the primary care level. And some of the conversations with the higher performing ACOs in this area in particular have given us some good ideas about care navigators for patients, and continuing to work on team-based care so that we have wraparound services for the patients. Access is a big issue when it comes to patient experience; so is communication. And then in another big area, education was mentioned earlier, but medication reconciliation is one of the top most important areas where we could improve all across the state, and across the nation for that matter, when it comes to patient experience. And so things like having a pharmacist on the team who can meet with patients and do medication reconciliation is another idea that we have learned from one of our ACO conversations. So happy to take any questions. Thank you for listening to my comments. Thank you. Ms. Tigray, did you have anything else or should we go to board member questions and comments? No, I think we can turn right to board questions and comments. Okay, all right. I'll open it up to any board members that may have questions or comments. I will keep the slides up if you want to refer to a certain slide. I know that the numbering is odd because of the way that the presentation was combined, so just let me know where to navigate to. Thank you, Michelle. Question. So I'll go ahead and jump in. I appreciated, Andrew, your commentary around sort of the COVID influences of 2020 through 2022. And certainly, I think that it's easy to kind of think through some of that on the fly, maybe not easy, but you can certainly see how the care pattern disruptions impact the financial component. I'm wondering if, and this can be really to anyone, if anyone would comment on what they think the potential impact is in relationship to the quality measures. Yeah. I'll defer to the quality people, but great question.
And intuitively, it's had to have had a huge impact. Access has just been so skewed. And folks have just had other things to contend with. Yeah. Great question. Robin, this is Grace. I'll just add that if you look at the graphs for QHP and large group during the core COVID years, you can definitely see that we saw a decrease in the quality scores overall. But what's telling is that, again, if you compare 2018, when Blue Cross started with OneCare, and 2022, our last agreement period, there doesn't seem to be any change. And maybe that's a result, you know, a lingering result of COVID on quality. I guess time will tell, right? Thanks. I appreciate it. And if anyone else wants to chime in with any other thoughts, I'd love to hear it, but I know it's somewhat of an unknown still. So, but thank you for your thoughts. Oh, and actually, I had one other question for MVP. So I was curious why you don't risk adjust the benchmark, the ACO benchmark, given that you saw such a difference in risk between the overall QHP population and the ACO population? We do. There is a risk adjustment factor applied. So there is. Okay. But it just, it sounds like if this was contributing that maybe that risk adjustment factor didn't really, right? Yeah. Yeah. Okay. Great. Thanks for that. I could pop in with a couple of questions. Michelle, a quick question on the CAHPS scoring. Is this a national comparison group of all Medicare, or is this an ACO-attributed comparison group? What's the CAHPS comparison group that we're comparing to here? I will triple check. What do you know? My answer is MSSP ACOs. Okay. Okay. And then do you know the definition of the stewardship of patient resources score? I appreciate it, because when you mentioned you had it at your fingertips, I was looking forward to this. I do. So stewardship of patient resources is actually just one question.
And it is: in the last six months, did you and anyone on your healthcare team talk about how much your prescription medicine cost? Oh, okay. And then I had a question for the DVHA folks on the fee-for-service component. You had the graph there with the fixed prospective payment and the fee-for-service payment, and I was trying to understand, do we have any further breakdown of the fee-for-service payment? Whether that's out of state, or people in state getting care at non-OneCare providers? And the other question I have is, is Brattleboro in the fee-for-service camp or the fixed prospective payment camp, the Brattleboro Retreat admissions? Yeah. So we do have a breakdown of what we call the OneCare in-network fee-for-service component of the orange part of the stacked bar chart, the fee-for-service, and the out-of-OneCare-network component. We don't further break that down into what's in and out of state or DVHA's Medicaid network, but I can get that detail to you if you're interested in that. And also, the Retreat is not included in the total cost of care. Oh, that's out of the total cost of care that we see here? Yeah. I believe that a lot of the Retreat spend is from a Department of Mental Health fund source rather than a DVHA fund source at this juncture, and so it's excluded from the total cost of care. Okay. And then the other question that I have, and I think I probably know the answer on this, is: do you know if people who receive drug and alcohol counseling and treatment at places like the various recovery coach agencies in the state, that are not submitting a bill for that, are getting captured in the quality reporting? For measures where claims are not captured, I would have to get back to you on that as well. I do know that for some of the Medicaid measures, additional sources of information are used in addition to claims for some of those things that are not claims-based. I don't think we adjust it in the VMNG program. 
I think DVHA does for some of its measures, but we just take the raw numerators and denominators based on claims in the VMNG program. Okay. Just from a clinical standpoint, I would say the majority of patients that I see in the emergency department who have follow-up for drug and alcohol counseling have counseling with recovery coach agencies like Turning Point or something, and I believe they do not bill or submit a claim. So I think we may be performing better than we think there, because I think a lot of people are getting care outside of the confines of the claims-based system. So I will speak to that, also through the way that that's currently measured, and I'll speak specifically to Medicare, but also to the way that we calculated it in All-Payer Model reporting. It is a HEDIS-based measure, so it follows HEDIS specifications. We do recognize that there are other avenues by which someone in the state of Vermont could receive, and very likely does receive, different types of follow-up treatment. And as Amy mentioned, Medicaid has done some work to propose changes to HEDIS specifications for the Vermont population, but to the extent that those would be included in any of the federal reporting, it's not for Medicare, or for the state's federal reporting; that's not currently happening. Okay, thank you. Just a few other questions. In the Blue Cross presentation, you mentioned that you're seeing historically aberrant utilization right now in certain areas. Could you speak to the areas where you're seeing aberrant utilization currently? Oh, I think you're on mute. Thank you. I'm so sorry. I'm afraid that I cannot. I do have a few other folks on the team. I tend to see the roll-up numbers, and I'm just aware that we've had several months, several times this year, with just historically high utilization and cost, just really higher than anything we've seen in a long time. 
I don't know if anybody else on the call from Blue Cross can speak a little bit more about the details. I do know that some of it has been driven by very high-cost claims, which have been unusually high, but I think there's more going on there than that. Andrew, it's Grace. I can't speak to the detail either, but if there is no one else from Blue Cross on the call who can, Dr. Ramin, we're happy to get that follow-up to you, if you could give us a week or so; we're just having some internet issues right now. Right, I heard. All right, thank you. Thanks. I'm glad you're here, given your internet issues right now as well. And then, for Dr. Wolfman, you brought up some interesting questions regarding measurements, and you had mentioned access as an area to think about measuring in the future. I was just curious if you had any thoughts about rational ways that we could measure access. Thank you, Dr. Ramin. I brought access up when I was talking about the patient experience questionnaire and commented that what patients want is access to care when they want it, where they want it, right now. So I think that affects the patient experience responses. But more importantly, what can we do, and what are we trying to do, to help with access to care? I think the answer to that is we have built that into our population health model by incentivizing certain types of appointments: follow-up after ED visits, follow-up after hospitalizations within seven days, annual wellness visits, mental health follow-up, et cetera. So, you know, I can't tell practices, you need to add two more appointments per day. I can tell myself that, and I can add telehealth at night. You know, we have ideas, but you can't make people do these things. But we can incentivize them. That's one of our levers. We can give them the data and analytics showing how they're performing on these measures. 
And then we can incentivize them in different ways to perform better by getting their patients into the types of appointments that are needed and also desired by the patients. Thanks for that. Yeah. And thank you, everybody. This was a really interesting presentation. Mr. Reigns, do you have a comment? Yes. Thank you, Chair Foster, I appreciate it. Just one other thing that I wanted to add for the benefit of the board. One other area where we're really digging into the access issue is within the CPR program. We've made it a requirement of the program, under the policy, that the practices work with us on a plan to improve access in the state of Vermont. It's a huge undertaking, obviously. So we've started leaning into that body of work and discussing it with the CPR practices. Given that they're in the thick of the access problem, it can be a little challenging asking them to help us solve it, as I'm sure you can understand. But we have started to undertake that work. So I think that's another area that's worth highlighting in response to that specific question. If I could, one follow-up. Why the CPR program? I'm sorry, I don't know that I fully understand the question. You said that you're focusing on the CPR program; they must increase access and work with OneCare on addressing the access issues. And I guess my question is, why limit it? Why the CPR program and not the broader network? Good question. Sorry, I just didn't fully understand what you meant. So I think we generally consider the CPR practices, or the CPR program in general, to be a bit of a primary care incubator of sorts, in the sense that we're paying a fixed payment. It does represent a benefit above and beyond fee-for-service, as you know from previous explanations of the program. So I think it's more productive when we work with providers that are in that sort of an environment, as opposed to working with providers who are in more of, say, a traditional fee-for-service environment. 
Thank you. Member Holmes or Member Walsh, do you have any other questions or comments? Sure. This is Tom, if that's okay. Jess, do you mind? No real questions. I appreciate everybody coming, and I appreciate the information that's been shared. I also appreciate that earlier in the fall, during the budget hearing, there was a dramatic change in the description of the work that the ACO would be doing, and these data predate any of those changes. I think I'm just going to be blunt about observations from these data. The data that have been presented don't allow me, and evidently Mr. Garland either, to have any sense of what the ACO has done to change the way that care is delivered or to generate savings. There have been savings in some years for some payers, but not consistently. Quality measures have gone up and down in different payers in different years, like a scattershot, and there hasn't been any material presented to me over the last two years where I can say: here's a data point, here's an action that was taken meant to change that data point, and here's the change, good or bad, that occurred. There's no line of causality; that's disturbing. We're seven or eight years into an expensive experiment, and I don't think there's any way to know whether it's worked at all. It's tempting to conclude that the quality measures have gotten worse. I'm reluctant to draw that conclusion, because causality is not possible to determine. I think the quality measure scores declining are a symptom of our healthcare system being under stress. I think the changes that we're seeing in quality measures and healthcare spending, we are seeing while we have an accountable care organization in the state, not because we have an accountable care organization, and it's frustrating that we're seven or so years in and we can't draw a line to what's been happening. I'm still not sure what to do about that, but I find it frustrating. Those are my observations. 
One more thing that I find a little tough to reconcile. The CPR and PMH payments have been going to providers and systems prospectively, and in addition to what would be expected for fee-for-service, as the gentleman was saying a few moments before. Those payments were meant to help organizations transform from fee-for-service to alternative payment models. One way of assessing whether an organization has transformed is its risk tolerance: the acceptance of more risk would be evidence that the organization has readied itself to move from fee-for-service to alternative payment models, and we've not seen any substantial change in risk bearing. So it's hard to say that there's been transformation. Those are my observations, and I'll pass it back to you, Chair. Chair Foster, I can go unless you want to go first. Please go ahead. Okay. I think that Dr. Wolfman raised a good point: it would be really meaningful to compare savings and quality changes between the attributed population and the non-attributed population. And I also think there have been enrollment changes and risk changes over this time period. So one of my questions is, are any of the payers doing an assessment comparing the attributed population to the non-attributed population, to see whether these same quality changes, ups and downs, are happening in the non-attributed population? And similarly, have any of the payers taken a look at the continuously enrolled population? To see, you know, I was an attributed member in 2018 and I'm still an attributed member in 2022, and what's happened to the total cost of care for that patient population, and the quality for that patient population? I'm just throwing that out there as something that I think would, to some degree, help answer Member Walsh's question. It's still not going to be, you know, clean on causality, but it will get rid of some of the confounds. 
So I'm just wondering if the payers have done that, or what we can learn from that type of analysis if it's been done. Well, that's a tough set of questions, but I'll give it a start. Great observations. There are so many factors impacting these populations besides what's happening with OneCare, or has happened with OneCare; that is unquestionably so, especially for the QHP population. The membership churn in that population is much closer to what we're probably used to with a Medicaid population than with a traditional commercial population, even with members coming off and back on the rolls in the same year as finances change. As you also noted, we have programming aimed at cost and quality; a lot of others do too who aren't a part of this conversation or this relationship. In the past, we have done the commercial non-attributed versus the commercial attributed analysis, and actually what we found was counterintuitive: the non-attributed population did better than the attributed population, pretty consistently. Now, I don't believe we've updated that analysis in a year or two; it's pretty time-consuming, and just given where we are, it hasn't been at the top of the heap. The other thing I would say about the quality measures, and I'll invite Grace to say more about this, is that I'm sure ours have some up-and-down randomness to them too. 
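The attributed-versus-non-attributed comparison being discussed reduces to comparing a quality rate between two cohorts and asking whether the gap is bigger than chance. A minimal sketch of that check, using a two-proportion z-test; all cohort sizes and rates below are hypothetical, not any payer's actual data:

```python
# Sketch of an attributed vs. non-attributed quality-rate comparison.
# All numbers are hypothetical; this is not any payer's actual data.
from math import sqrt, erf

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)          # pooled rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))  # std. error
    z = (p_a - p_b) / se
    # Normal-approximation two-sided p-value via the error function.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: 62% of 1,200 attributed members met a measure vs.
# 66% of 5,000 non-attributed members.
z, p = two_proportion_z(744, 1200, 3300, 5000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With very small denominators, like the "denominator of four or one" raised in this discussion, the normal approximation behind this test breaks down, and an exact test such as Fisher's would be the safer choice.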
Again, there are a lot of things that happen in the population that can affect that. A key difference, though, is that when we run quality projects or quality initiatives at Blue Cross, we start with a pretty strong hypothesis about what we're going to impact, and we do everything we can to try to trace that impact to the quality measures that we then record later. That's not a perfect process; I'm sure our data folks could tell us all the variables that are hard to control in that analysis. But we hold ourselves accountable to that, and we can generally see, yes, the work we did impacted the population in the way that we expected it to. So I think that's a partial answer to your question; I hope that helps somewhat, and I'll defer to Grace. Yeah, really good questions. We actually had started the update to the assessment of attributed versus non-attributed in terms of the quality metrics, and as Andrew said, it is a huge lift. We are close to having it done, and we are happy to share it with you. Just from the preliminary results that I saw, and we need to double-check the data points, but just from the preliminary view, it is exactly what Andrew described: the non-attributed cohort is generally outperforming the attributed cohort. Do we know why? No. I suspect it would take quite a while to figure that one out. Yes. But I will say this: in terms of the attributed members, again, on the cost side, there has been a considerable investment toward improving the quality metrics for the attributed lives, above and beyond what we as an organization invest in helping our members to have the best care that they can get. So I would have expected that, because of that 12-million-plus-dollar investment, we would have seen better results in the attributed cohort. Yeah. Thanks, Grace. We should just add one more quick clarification. You asked about the 
continuously enrolled population. We have analyzed that. I wish Martin was on this call to give us the details; I have not looked at that in a while. I do recall that that population is getting smaller and smaller, and we even worked with OneCare to try to use the trend that we're finding in that population to inform the risk model. So I'll take it as a follow-up to connect back with Martin, summarize what we've learned through that analysis, and share that back with you. That sounds great, thank you. I don't know if MVP has any thoughts on this as well. Yeah, I was just going to interject that while MVP has not done an analysis on quality for the OneCare population compared to the market population, we did for the financials, and the outcome was actually better in the OneCare population than for the market population. As a matter of fact, we were applying a market trend factor to the budgeted target quarterly, and OneCare did outperform. However, again, it's hard to trace that back to something in particular in terms of work OneCare has done or interventions they've made. Additionally, the attributed population this year isn't the best representation of what the attributed population should be for MVP, as we had some inherent issues with our attribution that we've corrected. We've also updated our attribution methodology in general, and we're having much better outcomes with it, so we'll have to see how things trend in '23. Thank you, that's great. I mean, I think going forward these would be really helpful types of analyses to see, if that's something that you can do. Another thing, I guess for me personally: looking at the slides and seeing the deltas between 2021 and 2022, and the up arrows and down arrows, it would be really helpful to assign statistical significance to those changes, because as you look at some of those changes, and it's counted as an up or as a down, just eyeballing it, to me it looks 
more flat than up or down for some of them. Some of them definitely appear like I can imagine they might be statistically significant changes, but I think just from a naked-eye look, and I understand there are small denominators as well, for some of the reports, you know, when there's a denominator of four or one, to me that's rather meaningless. And I know that some of that is subtracted from the analysis as well if the denominator is really small. But it would just really help to shape our understanding of what's really changing to have the statistical significance of those changes. So those are, you know, two kind of observations, or questions, I guess. I would just ask if there are any insights on the results that struck me. A lot of things struck me, to be fair, but the two that I thought I would ask about, because they seemed so interesting and troubling, were the all-cause readmissions, if anybody has any insights into what is happening there, and then the 30-day follow-up after discharge from the ED for mental health. And then there was the similar follow-up after a mental health admission in some of the other payer data. It's all over the place. I mean, so I just thought, what is happening there? To Dr. 
Wolfman's point, everybody is treated similarly, but we're seeing a huge increase in follow-up for Medicaid patients after a mental health ED visit and a huge reduction for Medicare patients. The swings seem rather wild, and I'm just trying to understand that. And then the unfortunate decline in the quality metric related to all-cause readmissions. Both of those struck me as worthy of asking questions about, if anybody has any insights or intel that might explain some of the wide variations on the mental health measures and then the all-cause readmissions. And maybe that's a OneCare question, but I would ask anybody. I can go ahead. I did have my hand up for another point that I wanted to make really quickly, which is: if people start doing the comparison between the attributed and unattributed groups, let's remember to risk-stratify them clinically, this is using risk a different way, and include the social determinants of health in that total all-person risk, because that is a big influencer. And then, to this question, I think we'll need to research some of the payer-to-payer variation. We have some more recent data in. I cannot answer why, for Blue Cross and MVP, the all-cause readmissions performance was actually worse in '22 than '21. I do think that we still cannot explain all of the impact that the pandemic has had, and actually continues to have, so that may be a factor. When it comes to the good results on follow-up for the Medicaid population after an ED visit for mental health, most likely that is due to concerted effort by certain groups. And again, that is a claims-based result, so we'd have to do some more research, but I can get back to you on that. You know, we improve what we focus on; if we don't focus on it, we usually don't impact it quite as much. Yeah, great, thank you so much. I appreciate that. I guess my final question is for you, Andrew. You mentioned the new challenge that, as you know, 
absent systemic protections for Blue Cross Blue Shield member data, the transition of ACO data operations to UVM Health Network remains problematic. And so I guess my question there would be: what systemic protections would you like to see in place in that transition? Yeah. I don't have any of our attorneys on the call who worked on this issue; this is pretty specialized stuff. But we do have a pretty good memo together that we sent to OneCare last year, sort of detailing the issues that need to be resolved. So I think we could redact that memo and send you the key points. But basically, you know, we're in a situation where the data is being moved to an organization that we work with on multiple levels, and we just have to be sure that when they're accessing the data, it's being used strictly for the purpose that it's transmitted for, right, solely for ACO operations and reporting, and that it doesn't find its way into other uses. As I said, we have a multifaceted relationship with UVMHN, and we just need to make sure we keep the lanes clear. Well, thank you. I appreciate that, and I think it would just be helpful for us to understand what protections you'd like to see in place. Yep. It's very clear-cut, and I think all of it comes from rule, or NCQA standard, or HIPAA standard, so it's pretty standard stuff. Great, thank you. That's all I have, Chair Foster. Thank you. A couple of questions I had are a little similar, so if they are a little repetitive, I apologize. It seems as though in some years there are pretty good savings for one payer, while in the same year there are no savings, or minimal savings, or even losses, for a different payer. And I was trying to understand what explains that. I could maybe take a stab at that. Sure. So the question is why, in a given year, we might do well on a total cost of care program with one payer and not with another payer? 
Correct. Yeah, so I would say probably the reason that comes to mind most immediately is just the fact that we use different processes for setting the targets with the different payers, and they're based on different cohorts of patients, so they present very different base years. There are different factors that come into play in negotiating the performance year target; for example, the process with Medicaid is very different than that with Medicare, as you know. So I think all these different factors come into play to result in targets that are very unlikely, just by the nature of probability, to perform consistently. How do we think about and segregate out sort of the math problem, the math that's being done to set targets, from interventions or OneCare programmatic efforts that result in savings? I think that's kind of a tricky question for me to answer myself, but I can tell you what my particular corner of OneCare is trying to do to address that; I think I can get at that more directly. As you can probably imagine, when we work within the confines of one-year programs, it can be difficult to really isolate factors that are driving performance year over year, because from one year to the next, things can be very different at a given facility or otherwise, just based on natural factors. But a lot of what we're trying to do now is give the network a better look at what's changing from the base year to the performance year. So there's a lot of focus on that in our HSA consultations, and we're starting to report on it more widely in other forums. And I think that work was something that OneCare was doing before the pandemic and is now returning to, now that we have base year and performance year data that's not so marred by pandemic patterns and things that really dictated how providers had to react to the market, rather than sort of 
trying to, you know, be the owners of their own behavior, so to speak. Oh no, please go ahead. I'll add one point to that. That is a really nice placeholder for a reminder that Lindsay and I will be talking to the board about the Medicare benchmark in a couple of weeks, and it ties in well to the question that you're asking here about how those are set, and how risk corridors can impact the potential savings or losses earned, at least specifically in the Medicare program. And if I can just add one other factor, Chair Foster, that's worthy of consideration: when we think about the populations within these payer programs, Medicare versus Medicaid versus commercial, there's a very different presentation of social risks, social determinants of health, and other risk factors, and they have very disparate impacts on how these programs come to play out. The Medicare population is plagued by different problems than the Medicaid population, and so I think that also factors into some of that variability. Wouldn't those go into how you set the target, though? Sure, that is a big part of it. But the targets are an imperfect animal in that way. They certainly work at a macro level, but in a micro way, when you get into any specific issue or problem, things like that can start to break down. Okay. Kind of a related question. It looked like, from some of the data we saw on quality, in particular the Blue Cross, I think Ms. Gilbert Davis called it a spider graph, you do see quality performance go up, then down, and it actually kind of all goes in tandem, up and down, generally speaking. I was wondering if there are any thoughts from, yeah, this one, yeah, that's great, thank you, the OneCare folks. Why is this? What explains the fact that it goes up one year and down one year, and it all kind of moves together? 
This is from large group. I would also have OneCare comment on QHP, which is right there. Yeah, same thing, yeah. So I think what I'm seeing here, when I look at this, is that the spider effect is driven more by variation in performance across different measures than necessarily within a specific measure. So, for example, not a ton of variation year to year within child and adolescent well-care visits, but you can see that the performance between, for example, child and adolescent well-care visits and diabetes mellitus management is obviously very different on this graph. So I think that's more where the spider effect comes in. But if you look at the year-over-year performance within given measures, it tends to run relatively consistently. I agree. I think you have to really pick apart how this graph is designed to understand the ups and downs. If I may: for any one point on this graph, for example, if we look at child and adolescent well-care visits, the variability from one year to the next for that measure is seen in the color of the dots, right? I think the choice of using the spider, of connecting the lines, gets confusing, because the performance isn't connected across measures year to year; it's one measure, multiple years. So for child and adolescent well-care, at its highest point, the gold-colored dot is 2021, or the yellow, and the lowest, which is kind of the bronze color, is 2019. So that's the year-to-year variability. And this gets to Professor Holmes's point earlier about statistical significance. When I look at these charts, and admittedly they're hard to understand, I don't see much change for any measure year over year. There's no substantial change, and there's no statistical test that's been done to let us know if there's a statistical difference. If the sample size is large enough, even a small difference could be statistically significant, but when I look at these dots, 
there's no substantial change in any of these measures. Yeah. I'll just briefly add: we have marked these slides in the past, not this slide, for statistical significance, and we can go back and do that again; that's not a hard add. And we have seen statistically significant movement; I know I've reported that to you in the past. What we haven't seen is sustained statistically significant movement, so a measure that moves statistically up one year might drop back down again the next year. But we can absolutely add that data. I think we get into pretty small numbers on some of these, but we'll mark those up for you and send them back. If I could just add, oh, sorry, go ahead. I like statistics, and I like statistical significance for implication, but I think it's also important that we think of overall meaningfulness. As a clinician, we think of what's a clinically meaningful change, or the minimal clinically important difference that we need to see, and of course this is policy, not clinical. So in addition to wanting to know if there's statistical significance, I also just need some face validity that it's substantial, and these are not substantial changes. Go ahead, Mr. 
Reigns, and then I think Member Holmes had something. Yeah, one other really small point that I wanted to add to this, that I think adds a little bit to the variability, especially in the context of statistical significance, is that these are different cohorts. So you're going to see some variability in the results from year to year, given that you're measuring different sets of people. And the provider network also changes year to year to some extent; not greatly, but it can change. So there are other reasons for that variability too. Chair Foster, I know you're in the middle, but if I could just address a point that was made a few moments ago about the difficulty of answering questions like why so much savings in one area and not in another, and the flip from year to year. Some of the responses that we heard were that, well, there's a lot of churn in the patients that are in the cohort that are attributed, there are natural changes and factors, there are externalities, and so that makes it difficult to predict what will happen. And I understand all that to be true. But the point of alternative payment models, and health care reform writ large, is to overcome those limitations, right? So that we can create policies and interventions that improve the health care experience for patients, improve outcomes that matter to patients, and are less wasteful than fee-for-service has proven to be over the last 50 years. And if we're saying that those factors don't allow what we are currently trying to do, that those externalities, those natural changes, and the churn don't allow us to predict what's happening, that in essence is saying that what we're doing is not working. Let's just be clear about that. If we can't overcome those things, then what we're trying to do is not working. Those things are always going to be there; we would need to overcome them. Thank you. I'm going to Member Holmes, you had something, and then I have, I think, just one or two more questions myself, 
and I see that Dr. Wolfman has her hand raised. But Jess, do you have your... okay. Dr. Wolfman, did you have anything? Yes, thank you, Chair Foster. Just in response to what Tom just said: I agree. I think, as a practicing clinician in primary care, we have been lax about evaluating social determinants of health, and going forward, health equity and determining the social needs of Vermonters will play, needs to play, a bigger and bigger role in this work. And I'm happy to say that lots of people came together on October 27th to talk about this and how we can align across the state; many of the people who are presenting today were there, or representatives from the payers were there. We hope to have another meeting next on January 5th. But building this into our work is one way that we can help all Vermonters get the care that they need, get access to care in the right place at the right time, and get what they need to be healthier. So I think that is a huge factor, and I'm glad to say that we are working on that. I think we cannot underplay the need to align on the quality work, and in that area we need buy-in from the people who are providing the clinical care, so providers need to jump on board more than they have. I think those are some of the factors in why we haven't made more progress. Thank you. I agree with your comments there. I have one small question and then one other one. On the CAHPS surveys, I thought someone said that we dropped doing the phone calls to do the CAHPS surveys. I'm just curious, two questions: why, and did it limit the volume? For Medicare specifically. I can't speak to whether or not OneCare did that for the rest of their payers; they certainly could. And last year we did see a decrease in response rate based on that choice, which is when it was initiated for Medicare specifically. I can do some research and get back to you on that with more information; I am not sure of the 
answer.

And then the only other question I had: I'm not as close to this as everyone else, and you have all been doing this a lot longer, but I'm trying to get at the causality, which I understand is massively difficult and complex. Should we, as a regulator, be looking to see the quality scores improving, and if they are, should that correlate to savings? What I'm saying is, if quality is flat, and we have savings in one year and not in another while quality stays flat over the whole period, does that indicate to us that it might be more of a how-we're-calculating-total-cost-of-care issue, as opposed to an effect of the interventions or the incentives that we're using? And, you know, I should be much more skilled at asking questions, given that I spend most of my life doing it, so I'll try again, because that was a horrible question and no judge would ever allow it. Should we expect to see quality scores move in tandem with savings?

I think that's a really difficult question to answer in a general way. Generally speaking, yes, you would think that as quality scores improve, cost would also improve, but I don't know that you could necessarily draw the conclusion that the two are directly correlated all the time.

Mr. Garland, did you have a thought?

Yeah, thanks. I'm going to answer from more of a principled position than an expert position, and I think the answer is absolutely, but quality should outpace cost. At the very minimum, we should be seeing the quality improvement, clearly, where we are putting in effort to improve the quality.
I believe that savings will be slower to come, because this is a complicated financial system with so many revenue opportunities that it tends to be, I believe, a bit self-healing on the revenue side. If revenues fall because quality improvement leads to less costly specialty visits, revenue-driven organizations are going to manage to their revenue targets by finding those dollars somewhere else. Generally speaking, then, I think the financial performance is always going to lag a bit behind the quality metrics, but if we're not seeing the quality improvement, I don't know what financial improvement would tell us, and I would be concerned about that state of affairs. I don't think we've seen that in Vermont, but in other parts of the country during the HMO movement there were certainly places where quality did not improve, or got worse, while cost went way down. That was not good for patients, and it's not what we want here in Vermont.

Right, good point. A couple of hands came up, so I'll go Member Walsh, then Mr. Reigns, and I believe Ms. Kil had her hand up as well. Member Walsh, and I think you're muted, Tom.

Thank you. Andrew hinted at the point in the last part of his comment. In the managed care era we found that it's very possible for any provider system to dramatically reduce its expenses to Medicare, or to any payer, just by refusing to see people, or refusing to see sick patients.
Right. And so when ACOs started and we talked about bending the cost curve, we knew that any organization could achieve savings, if it set its mind to it, by refusing to see sick patients. The quality measures in an ACO are there to make sure that does not happen: you have to achieve certain quality benchmarks in order to receive back the savings; they unlock the savings. If you don't achieve the quality, you don't get the savings. They're there as a check to make sure there's not undue rationing. So an organization could have moderate to good quality that is flat over time, and it achieves savings in some years and not others depending on its volume of care, but you've got to achieve the quality benchmarks in order to unlock the savings that come back to you. That's why quality measures are there.

Mr. Reigns.

Yeah, I was just going to add, Chair Foster, that part of why some of that alignment between cost reduction and quality improvement isn't so apparent may be because we function in one-year programs with one-year targets, and oftentimes, within a given year, quality improvement might involve increased cost; you might have to spend more to improve quality, at least initially. Another point I would make is that sometimes the investments in quality improvement don't sugar out within that program year. These are multi-year investments in patient health, so things that improve quality over multi-year periods don't necessarily appear within the given performance year, and that correlation can be difficult to draw. I certainly agree with the points made by others that there should be a direct correlation between the two; it's just not always easy to clearly identify that correlation within the confines of a one-year program.
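The quality gate Member Walsh describes, in which benchmarks must be met before shared savings are unlocked, can be sketched in a simplified form. The gate threshold, sharing rate, and dollar figures below are hypothetical illustrations, not any Vermont program's actual methodology:

```python
# Simplified sketch of a shared-savings "quality gate": savings are
# measured against a spending benchmark, but paid out only if the ACO
# meets a minimum quality score. All parameters are hypothetical.

def shared_savings_payout(benchmark: float, actual_spend: float,
                          quality_score: float, quality_gate: float = 0.70,
                          sharing_rate: float = 0.5) -> float:
    """Return the savings shared back to the ACO, or 0.0 if none."""
    gross_savings = benchmark - actual_spend
    if gross_savings <= 0:
        return 0.0          # spent at or over benchmark: no savings to share
    if quality_score < quality_gate:
        return 0.0          # quality benchmark missed: savings stay locked
    return gross_savings * sharing_rate

# An ACO that cuts spending while letting quality slide fails the gate:
print(shared_savings_payout(100.0, 90.0, quality_score=0.55))  # 0.0
# The same spending reduction with adequate quality unlocks a share:
print(shared_savings_payout(100.0, 90.0, quality_score=0.80))  # 5.0
```

The point of the gate is visible in the two calls: identical cost performance yields zero payout when quality falls short, which is the check against undue rationing described above.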
Yeah, my question was getting at the over-time picture: quality looks pretty flat from 2018 to 2022, whereas the savings go up and down.

Yeah, and I think that comes back to that slow burn on quality versus the one-year program target, which can be variable year to year, as we covered.

Okay, great. Well, thank you. Any other board member questions or comments? Oh, sorry, Ms. Kil.

Sorry, I put my hand down because I was like, oh, he knows. Just two things that I wanted to bring up to address your question about the differing trends between financial performance and quality. The ACO's model is a provider-based model, meaning that people are prospectively aligned based on their engagement with their provider, and that provider doesn't necessarily have to be in Vermont, or in their area. But the way that we measure quality and financial performance, for some of the measures, is from a resident-based analytic perspective. Those are two very different perspectives, and we work really hard to try to reconcile them. That's one thing I wanted to bring up, and the other piece has maybe just escaped me; okay, now I've forgotten it, so I guess I'll just leave it. Sorry, I just wanted to add that as a reminder.

Great, thank you, and if it comes back to you, just reach out and we can talk. All right, I'll turn it over to the Health Care Advocate for any questions or comments.

Thank you. I just have one question, and thanks, Michelle, for all your work on this. If you wouldn't mind going to slide 21; this is a question for the folks from DVHA; I don't know, it's a ways back. Yeah. So just on the first bullet, I was curious if folks from DVHA could talk about what they think was the cause of the difference between expected and actual, for both the ACO-attributed members and the expanded attribution cohort. Thanks.

I think
this is Alicia. I think what we've seen in the last several years is performance following a similar trajectory, with the actual experience being slightly less than the expected total cost of care for both our traditionally attributed cohort and our expanded attribution cohort. Honestly, over the last several years it has been a little bit difficult for us to pinpoint the drivers of the difference. I think utilization patterns looking different in the pandemic and post-pandemic years, as compared to benchmark years that were based on a pre-pandemic level of utilization, is one of the things that accounts for it. We would certainly welcome input from the OneCare team as well; I know they'll do some additional analysis of what they're looking at in each case. But since this experience was relatively consistent with what we've seen in prior years, we didn't have anything additional that we had flagged as a driver for 2022.

Okay, thank you.

Okay, I'll open it up to public comment via the raise-the-hand function. Ms.
Wasserman.

Yes, thank you. I have a number of comments on both the OneCare 2022 quality results and the financial results, and I'll start with the quality. I'd just like to point out that the overall quality scores, in my view, are abysmal. After all these years, OneCare's quality appears to be spiraling downwards. Medicare has a score of 66 percent; that's the lowest ever. Medicaid, 65 percent, the lowest ever. MVP, 45 percent, the lowest ever. And Blue Cross Blue Shield basically sees no difference between ACO and non-ACO lives. So we're talking lowest ever: lower than the 2020 year of COVID, lower than the 2021 year of post-COVID. When we say the scores are level, I don't see level; I see a downward movement. Additionally, in terms of a comparison between non-ACO and ACO patients, I think it's instructive to look at OneCare's presentation from several weeks ago on their CPR program. It turns out that CPR practices perform significantly fewer adult wellness visits than non-CPR practices; that was slide 32, if you care to go back and look at it, in OneCare's presentation on November 8th. Annual wellness visits are, as we all know, one of those critical indicators of the kind of care patients are receiving, and so I found it a surprising data point that the non-CPR practices actually perform more adult annual wellness visits than the CPR practices.
Moving on to OneCare's 2022 financial results, I'd just like to take an overview and look at it by category. In terms of losses, OneCare had losses in Blue Cross Blue Shield primary, and OneCare had losses in MVP's QHP. The next category is no savings, which applies to Blue Cross Blue Shield's QHP. In Medicare, in my view, the savings were meager: 490 thousand dollars. But in Medicaid the savings were phenomenal: almost 12 million dollars. So my question is, if almost all of OneCare's payer programs in 2022 had pretty marginal financial results, how can we explain OneCare's phenomenal Medicaid savings of almost 12 million dollars? Physicians, as we all know, are payer-blind and treat all patients alike; that's been mentioned a couple of times in today's hearing. So is there any way we can find out exactly what OneCare did to achieve this 12 million dollars in Medicaid savings in 2022? Can OneCare name the interventions responsible for these savings? Another question is whether DVHA is doing due diligence in negotiating its initial contract with OneCare, and I'm curious about the methodology, because the distinction between the Medicaid savings and all the other payers is pretty dramatic. By the way, the term savings is a bit of a misnomer: these quote-unquote savings do not make health care more affordable, and in no way do they reduce the cost of health care; rather, it's merely a transfer of public funds to the private sector. And finally, I'd like to recommend that the Green Mountain Care Board and AHS determine a way to take these so-called savings and use them to reduce the cost of care. How can we take savings and make health care more affordable? Thank you.

Thank you. One of the questions there was pretty similar to one that I was trying to ask, but perhaps quite a bit more eloquently, and if OneCare had any thoughts on it, I
would appreciate it. On the question about the high level of 2022 Medicaid savings versus the other programs being flat-ish, do you have any thoughts on how we should contextualize or understand that?

Well, I can't answer that, obviously. I would love to be able to, and I do think we should ask our CFO if he has insight about it; he would be the most expert opinion. I would love to say it's due to care management, and hopefully there is a connection there, so we will look into that and see if we can answer with data and give you the truth of it. I think the point about the upfront negotiation being a possible factor is a good one. So, Derek, you may have a better answer than what I just gave.

Well, I was just going to say that there's also some variability in the way the programs are designed. For example, the entire Blueprint cost is built into the Medicare target and spend; as much as I understand, it's built in on both sides, but that's something that doesn't exist within the Medicaid program. So that's savings that's earned, pre-funded to the network by way of the Blueprint program. Things like that just make the programs really different. The way the target was set with the commercial payer for 2023 was very different from the way we set our Medicaid target, versus how we set our Medicare target. I don't mean to give general answers to a very specific question, but I think it's one of those devil-in-the-details things, where you really have to look at the targets and the methodology separately to explain the differences.

Understood. Okay, well, thank you, and if you do have thoughts after you speak with them that you want to share, we'd certainly be interested in hearing them. Mr.
Carpenter.

Walter. Hey, Walter, it's fine, Owen, no need for that. How are you? I'm okay; how are you? I'm really sick right now with a virus or something, but speaking of that, just a couple of things. First, I want to say that I fully agree with Tom Walsh's previous comments; I think he was the one that nailed what's going on with all these charts and graphs. Julie Wasserman's comment about the transfer of public funds to private companies is also right on, because that's what's really going on here. And when we talk about payers, I've reiterated this umpteen times like a broken record: the insurers are not the payers. We are. The insurers are the ones who distribute the money; they are the middle people; we are the payers. The last comment is on improving access to health care, which I found really interesting, because as a patient, knowing that I have a deductible, even though I'm on Medicare and some of these crazy supplement programs, is what prevents me from seeking medical care, because I know that I'm going to get hit with an astonishing bill later on. The access problems are caused by that deductible and co-pay, if not by any of the other things we've talked about here. I mean, yes, transportation matters, but at bottom the real culprit is that you can't afford the access, because if you have a six-thousand-dollar deductible, how can you do that? Most people living paycheck to paycheck can't afford any of that. So when we talk about improving access, I'd like to hear more about what you can do, or what we can do, to mitigate co-pays, deductibles, and the like, which are the insurance companies passing costs on to us. Other than that, I again agree with a lot of what Tom and Julie said, and with some comments that Owen has made too, about how flat a lot of these charts are.

Thank you, Walter, and I hope you feel better. I barely made it through the meeting. Well, we're glad you did. And last but
certainly not least, Mr. Davis.

Thank you, Mr. Chairman. Of course, you've got a huge volume of material going by today, and you can't come close to getting all of it, but I've just got two or three comments. The first is that I just don't think OneCare Vermont can really control quality; it doesn't have enough power. The reality is that you already have data in your archives that show this, like some of the PQI data, the PAU data, the recommendations on what to do with small hospitals, that kind of thing. OneCare has, I think, 31 quality measures; maybe it's 85, but the number I remember is 31, and every single one of them is basically fill-in-the-box: did you give somebody a pill, did you call somebody back, that kind of stuff. The real question in quality is what the actual performance in the delivery of care is, and we just don't have many measures for that. The ones you do have are very clear: what they show is that the UVM system is by far the best in PQI and is just way ahead of anybody else. But the problem is that this quality problem is everywhere, including at UVM, and nobody has any real power to go after that stuff. So I just think that what you really need OneCare for is to change reimbursement from fee-for-service to capitation, which is the secret to the whole ball of yarn. In any event, on the question, Mr.
Chairman, I thought your question was good about trying to figure out why these lines are acting funny, with some things going up and some things going down. The connection between cost and quality is obviously critical, but it only really runs in one direction: if you improve your quality, you're going to improve your costs, but you can improve your costs without improving your quality. So the question really is still out there: what are you going to do about the quality of the system? And I just think that what you're going to have to do is go into the individual hospitals, because any hospital that gets told anything by OneCare can just tell them to forget it; they don't have to do it. And the reason people are doing unnecessary care, low-quality care, is that they believe they have to have, and need, the money to keep the doors open. So in any event, you've got that massive data that we saw today, and nobody's going to be able to do anything with it; you can't even do the basics. And that's not because there's something wrong with you; there's nothing wrong with you. The problem is that the internal connections, the way the machinery is built, just don't yield to OneCare Vermont. Thank you.

Thank you very much. Any other public comment? Great. Okay, well, thank you, everyone. These presentations go by fairly quickly, and there's a lot there, and I know the work that goes into them is massive. We get this really nice presentation that's easy for us to digest, but there's so much more work that goes on behind it to make that happen. So I just want to recognize that and thank everyone from all the payers and OneCare, and our team as well,
Lindsay and Michelle. So thank you very, very much. Is there any old business or new business to come before the Board? Is there a motion to adjourn? So moved. Second. All those in favor, say aye. Aye. Aye. We are adjourned; have a nice afternoon.