All right, it's 12:30, so we'll call to order the Green Mountain Care Board's November 21st, 2022 board meeting. I'm Owen Foster, and today we have a presentation on the ACO 2021 financial settlement and quality performance panel, which will have a number of folks presenting to the board. First, I'll turn it over to Ms. Barrett for the executive director's report. Thank you, Chair Foster. Just a friendly reminder that we do have several ongoing public comment periods. They are listed on our website under public comments, and please submit any comments you have through that website portal. The areas we're accepting public comment on are the HIE plan, the five-year HIE plan from the State of Vermont; OneCare Vermont's budget; the all-payer model two-year extension, or I should rephrase that, the all-payer model extension; and then the next potential model following the all-payer model. And then I also want to remind folks that the board is going out on the road on December 5th. We are headed down to Rutland, Vermont. We will have more details on the plan for that day on our website later on this afternoon, but for the public to know, our meeting will start at 1 p.m. on December 5th, and there'll be an in-person as well as a virtual way to access that meeting. And with that, I will turn it back to you, Mr. Chair. Thank you very much. We're really excited about the December 5th hearing in Rutland, where we'll get to meet with a lot of folks from the community and various treatment providers. Before I turn it over to our panel today: we had a lengthy board meeting last week with a number of topics, and we have board minutes from November 16th, 2022. Is there a motion to approve the minutes? I'll move approval. I'll second. Is there any board discussion? All right, those in favor of approval of the minutes from November 16th, 2022, please say aye. Aye. Aye. Aye. Aye. The motion unanimously carries and the minutes are approved.
With that, I'll turn it over to Michelle DeGree, who is our Health Policy Project Director here at the Green Mountain Care Board, and I'll have each of the presenters introduce themselves. So, Michelle. Thank you, Chair Foster. Can everyone hear me okay? Yeah, all right, I'm getting up and running. And are you able to see my screen? Yeah, okay, great. I had asked the presenters to introduce themselves at the beginning of their respective sections, Chair Foster, if that's okay with you; it might make more sense. We can certainly have a full introduction now also if that's preferred. Whatever you prefer, Michelle. Thank you. Okay, let's stick to introductions before each section of the presentation. So with that, I will go ahead and get started. Today, as mentioned, is a review of OneCare Vermont's 2021 financial settlement and quality performance relative to their respective payer contracts. We'll do a quick introduction and background, then talk about 2021 results through each of the payers represented here. So we have Medicare, Medicaid, and commercial, both Blue Cross and MVP, and then we'll have an opportunity for ACO comments, although they're mostly here to respond to any board questions or comments. Just a reminder that, as in years prior, GMCB staff today play the role of Medicare. So if there is a question that Lindsay Kill or myself are not 100% confident on, we will have to come back to you with that response; we don't want to misrepresent our federal partners. It puts us in an interesting position, as we're both staff and, in this case, a contractor to the ACO, so I just want to make that clear. But as we move through, if there are questions specific to Medicare, we might have to defer a little bit, but we'll do our best to get a very timely response; given it's a holiday week, I cannot promise anything. So today, again, is a discussion related to the board's ACO oversight authority.
Specifically, this is under Section 5.403(a)(4) of certification, where the board requires financial and quality performance results under payer contracts. So this is just a public review of those results. The quality and financial performance discussed today is a reflection of the ACO's performance relative to its individual payer contracts, and does not necessarily reflect the ACO's contribution to the state's performance within the all-payer model. We do recognize there's some alignment, and I will certainly talk through that a little bit. So again, today we're focused on 2021 ACO payer performance. Evaluation of the APM is ongoing: we, GMCB staff, will be producing financial, total cost of care, and quality reports on an annual basis, and we're working on the 2021 versions of those as we speak. So, as promised, and I know this is kind of hard to see, just a review; I've done this every year, so I want to try and provide some clarity to what's a somewhat muddy system. The payer contracts are roughly based off of the measures that are in the all-payer model, so you'll see a significant amount of alignment in the measures that are reported and sort of where they fall within the APM measure set. Similarities across programs are more noticeable beginning in 2019; this slide is just showing 2021. Given the ability to recommend design changes to the Medicare initiative during that year per the agreement language, Medicare measures for 2019 to 2022 are in better alignment with the other ACO and payer programs that are in operation. The differences that remain here are primarily due to types of covered lives; think about adolescent measures for commercial and Medicaid, but not necessarily for Medicare. I know it's hard to see again, but a reminder that there's intentional overlap with the model, and this isn't a review of the model. With that, I will put on my Medicare hat.
I'll actually turn it over to Lindsay to put on her Medicare hat, and we can get started. Thanks, Michelle. Hi everyone, this is Lindsay Kill. I am one of the analysts on the data and analytics team at the Green Mountain Care Board. As Michelle pointed out, today I'm putting on my Medicare hat, so I'm going to be speaking on behalf of our federal partners. Any questions that I can't answer for you today, I'm happy to get that information as soon as I can. So I just have a couple of quick slides talking about the financial piece of performance year 2021. First is just looking over time at OneCare Vermont's Medicare participation. On your right, we have a chart showing 2018 through 2021, those first four years, and then we have prospective numbers for 2022. We have the number of prospectively aligned Medicare beneficiaries; that is the population of people set out the prior year that we think, or hope, will be part of the model, expecting some fallout. And then the solid lighter blue line underneath is those that are included for settlement. Just as a reminder of the differences between these populations: in order to be included for settlement, beneficiaries must maintain eligibility for the entire performance year, or up until they pass away, and they also need to receive the majority of their primary care services in the ACO's network. And notably on this chart, starting in 2020, we see that there are substantially more beneficiaries that we lose between the prospectively aligned population and those included for settlement because of the increased enrollment in Medicare Advantage. So that's just something that we're experiencing in this state, and you see that reflected there in those differences. Next slide, Michelle. So this is unfortunately a little difficult to read, but it should be on the website. This is an updated performance year 2021 shared savings and losses report put together by our partners at the Lewin Group.
I'm gonna go through it kind of quickly. Starting on line 4, the total amount for the prospective benchmark for this year was $492.5 million. The shared savings advance amount, adjusted from earlier this year, is now $8.7 million. Adding those two things together, we get $501 million, and that's the benchmark for 47,575 beneficiaries. I will just call out that we look at these populations separately: where you see A and D, that's aged and disabled, and the next column, ESRD, is for the end-stage renal disease population. We blend them together for this per beneficiary, per month target; the total blended target for 2021 is $821 per member per month. Then jumping down to line 10, the claims-based payments are just shy of $258 million, and not far from that is the AIPBP fee reductions. Those are the fixed payments on behalf of Medicare beneficiaries; AIPBP stands for all-inclusive population-based payment. That is almost $225 million. We adjust that for uncompensated care, and that brings us to the total performance year 2021 Part A and B expenditures for the settlement report of $478.9 million. Michelle's gonna talk about the quality piece a little bit more, but the next section there on the report is the quality piece; notably, we have a $0 adjustment on line 16. Our gross savings and losses is $22.3 million, but there's the ACO cap amount, which is 2% of the adjusted performance year 2021 benchmark; that's the $10 million figure that you see there. When we subtract out the shared savings advance of $8.7 million and a small amount, 2% for sequestration per the agreement, then our final settlement amount is $1.2 million. Next slide, please. This chart here is just showing some payment trends between the fee-for-service amount, so that's our claims payment as usual, and the AIPBP, that all-inclusive population-based payment. We can see the ramp-up from 2018 to 2019.
We can see the reduction in total spend in 2020 with the public health emergency, and then in 2021 we see that starting to come back up, and notably that the AIPBP portion is getting closer to 50% of the total Medicare payment. Next slide, Michelle, please. And this is the last slide for my piece, but this is just looking at a high level at those figures from the settlement, reiterating what I mentioned from the settlement report. You can see here all of the years side by side. So the gross savings and losses, the cap: in 2020 and 2021, the ACO maxed out that cap, so those two numbers are the same. There was no quality adjustment in 2021, which Michelle will talk about. The risk arrangement in 2021 was 100%. And then we have that $10 million figure again for the adjusted cap shared savings and losses, and subtracting out the advanced shared savings gets that final net settlement adjusted for advanced shared savings. And I think that's the end of my slides. Back to you, Michelle. Thanks, Lindsay. So for 2021 quality performance, we'll look at the ACO's performance in the Medicare space in four domains: patient and caregiver experience, care coordination and patient safety, preventive health, and the at-risk population. To Lindsay's prior point, due to the ongoing public health emergency, all measures were reverted to pay-for-reporting in 2021, which results in a 100% score for OneCare Vermont, and thus there's no quality adjustment to their bottom line. A couple of considerations here I just want to flag. As I said, the ACO earned 100%. The ACO's score was also calculated using the pre-COVID points rubric based on the raw score for each measure; using that scoring methodology, the ACO would have scored 82.5%.
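The settlement arithmetic Lindsay walked through a moment ago can be sketched roughly as follows. This is an illustrative simplification only, using the approximate figures from the report; the exact ordering of the sequestration and advance adjustments in the agreement is an assumption here, so the result lands near, not exactly on, the reported final number.

```python
# Rough sketch of the PY2021 Medicare settlement math described above.
# All figures are approximate $ millions from the Lewin Group report.
# The ordering of the sequestration and advance adjustments is an
# assumption, so the final figure is approximate.

benchmark = 492.5          # prospective benchmark
advance = 8.7              # shared savings advance, added back to the benchmark
expenditures = 478.9       # PY2021 Part A and B expenditures

adjusted_benchmark = benchmark + advance           # ~$501M benchmark
gross_savings = adjusted_benchmark - expenditures  # ~$22.3M gross savings

cap = 0.02 * adjusted_benchmark                    # ACO cap: 2% of benchmark, ~$10M
capped_savings = min(gross_savings, cap)           # the ACO "maxed out" the cap in 2021

# Subtract ~2% sequestration and the advance already paid out;
# this lands near the ~$1.2M final settlement reported.
final_settlement = capped_savings * 0.98 - advance
print(round(final_settlement, 2))
```

The interesting feature is the cap: even though gross savings were roughly $22.3 million, the 2% cap limits the payable amount to about $10 million before the advance and sequestration come off.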
And just to say, for measures with no benchmark comparison available, we assumed full points, the same way we calculated it in 2020 when the same public health emergency was in place. The largest shift in the score is due to CAHPS performance, and this is largely due to the fact that benchmarks changed between 2019, when they were simply decile-based, and 2021, when they were updated to reflect actual performance percentiles. I will also add that in 2021, fewer measures have the possibility of earning points, which also adds to those full scoring points assumed where there was no benchmark. Just a reminder of CAHPS performance on the slide here: 2019 currently is the only one of the model years that was a pay-for-performance year. Overall, for 2021, the quality results were mixed. The ACO performed better in 2021 compared to previous years on some measures and worse on others, and performed well against benchmarks for some and not as well on others. We do believe that deferred care due to the COVID-19 public health emergency was continuing to impact care and utilization trends in 2021 and likely had an impact on the ACO's performance on their quality measures. While their performance is worse on some measures compared to before the public health emergency, many are now starting to trend in the right direction. So the first set of results that I'll show you here are the CAHPS results. As a reminder, the CAHPS survey is sent following hospitalization; it's a Likert-scale, rating-based scoring. The comparison here is to 2019 performance because no CAHPS was collected in 2020 for anyone; this wasn't just for OneCare Vermont, it was completely paused in 2020. So the 2019 rates are just shown here for reference. A couple of other highlights: the response rate in 2021 on the CAHPS survey was 6% lower compared to 2019, and again, the CAHPS benchmarks were changed, so I'll go into that a little bit.
There was a smaller sample in 2021, likely due to deferred care, as the sample for CAHPS surveys is based on E&M claim utilization. And I'll note that the CAHPS measure with the biggest performance rate change was access to specialists, which decreased from 77% in 2019 to 69.4% in 2021. This measure is likely heavily impacted by the public health emergency, which may be why it was designated as pay-for-reporting for 2021; that was prior to making all measures pay-for-reporting, so that was a change made during the performance year. The benchmarks were changed between 2019 and 2021. In 2019, the benchmarks were calculated as deciles, so the 30th percentile was 30 to 39%, and the 90th was 90 to 99. In 2021, the benchmark percentile ranges were updated to actual percentiles. As an example, and this gets weedy and I apologize, but I think it's important: for the CAHPS timely care, appointments, and information measure, that first measure you see, the 30th percentile was updated to 81.2 to 82.2, and the 90th percentile was updated to 88.1 to 89.4, compared to 2019, when the 30th percentile was simply 30 to 39. So that's a pretty drastic change there. That measure, timely care, appointments, and information, would have had the biggest decrease in awarded points if the measure had not been pay-for-reporting. In 2019, that measure would have received 1.75 points based on the scoring methodology at the time, compared to just three-quarters of a point in 2021 if it had been pay-for-performance. So again, that's the drop in score that we saw. However, their performance on the measure was better in 2021 compared to 2019, so the change in potentially awarded points is most influenced by the updating of those benchmarks. And moving forward, it's important to note that CMS updates benchmarks annually to reflect changes in performance nationally; this is not specific to the ACO's performance in Vermont. Their rates are, but the benchmarks are not.
Okay, here is the list of the clinical quality and claims-based measure results. In 2021, the ACO performed better on all of their clinical quality measures compared to 2020; that's ACO-14, 17, 18, 19, 27, and 28, these right here. And then on the claims-based measures in 2021, the ACO performed better on some and worse on others, with the difference in performance being minimal, less than 1.5% in most instances, as you can see here. Most improvements were marginal, with the exception of the follow-up visits here: FUM, which is follow-up after discharge from the emergency department for mental illness. They performed 5% worse on initiation and 1% worse on engagement in this following measure as well. Measure specifications, and we talked about this last year, for ACO-8 and ACO-38 changed dramatically between 2019 and 2020, so performance for 2021 should only be compared to performance for 2020 and moving forward. So we kind of have to make a distinction between the 2018 and 2019 performance under the model and then 2020 and beyond. I know that's a lot and kind of confusing. So with that, I'm going to turn it to our next presenters from the Department of Vermont Health Access. Pat and Amy, if you could introduce yourselves and let me know when you'd like me to advance slides. Hi, good afternoon. I'm Amy Coonradt, and I am the Director of Operations for Accountable Care Organization Programs at the Department of Vermont Health Access. I'm here with my colleague Pat Jones, who will introduce herself after I run through my portion of the presentation. We're here to talk about the VMNG program's 2021 quality and financial performance. Next slide. I'll just frame this a little bit, a little bit of level-setting. The VMNG program is reinforced by DVHA's priorities. DVHA has three priority areas that it's been focusing on over the last six years or so, and two of them pertain to the VMNG program.
One of them is related to value-based payments, which is a big goal at DVHA, and this model is a keystone program of that goal. There's also a goal related to performance, and so by implementing the VMNG program, DVHA has been able to focus on Medicaid being a predictable and reliable payer partner. We're also able to focus on continual and incremental programmatic improvements as we make changes year over year to this model. Additionally, the program has given DVHA opportunities to align Medicaid with other similar payer programs in the state, and also to be an innovative leader and test new ideas that other payer programs might want to align with in the future. One example of this is DVHA's expanded attribution methodology for the VMNG program, which I'll get into further later in the presentation. Next slide. Just an overview of the history of this contract: the original VMNG contract was signed in 2017, and it was a one-year agreement with four optional one-year extensions. DVHA and OneCare Vermont triggered one-year extensions of this contract for all four of those years: 2018, 2019, 2020, and 2021. Additionally, in 2021, DVHA issued an RFP to continue contracting for ACO services for the 2022 performance year, and OneCare Vermont was the successful bidder through that RFP process. DVHA and OneCare then negotiated a new one-year contract, with the possibility of three one-year extensions, with a start date of January 1st, 2022, and DVHA is currently negotiating the first of those three one-year extensions for the 2023 performance year. The PMPM rates associated with the program and its payment model are renegotiated annually, since the attributed population has tended to change year over year. Reconciliation typically occurs annually for the program, but it can occur more frequently if deemed necessary. Next slide, please.
I'll just briefly speak about the ongoing impact of COVID-19 and related program changes for the 2021 performance year. During 2021, the COVID-19 pandemic and the associated public health emergency likely impacted many aspects of the healthcare system in Vermont, including the ACO's financial and quality performance in the VMNG program. In alignment with programmatic adjustments made at the federal level in 2020, DVHA had modified some of its contractual provisions to hold providers harmless for COVID-19-related impacts to cost, quality, and utilization during the 2020 performance year, and DVHA has since carried some of those modifications into the 2021 performance year. Specifically, these continued provisions included decreasing the downside risk in the program proportionally to the number of months in 2021 that were in an active federal public health emergency, which ended up being 12 out of 12 months, reducing the downside risk in the VMNG program for 2021 to 0%. And we also continued the policy choice to remove COVID-19 episodes of care from the calculation of the actual total cost of care, since that spend was not included in the expected total cost of care for the program when developing rates. Next slide. Okay, now we will start to get into 2021 VMNG-specific program performance. One thing that we'd like to remark upon and emphasize is that the VMNG program continues to be stable in terms of both its size and scope, as shown in the table on this slide. As we can see, the program grew in its first four years, 2017 through 2020, and then leveled off in the 2021 and 2022 performance years. The number of communities, or health service areas, participating in the program has remained at 14 for the last three performance years, so high and stable, and provider participation has remained pretty constant in 2021 and 2022.
And the number of attributed Medicaid members has remained stable or continued to increase. We think that this, combined with the use of an expanded attribution methodology for 2020 through 2022, indicates that the program may have reached scale for Medicaid and may not see much more significant growth in future performance years. As a reminder, for 2020, Medicaid began using a methodology that attributes Medicaid members to the program both if they have a demonstrated relationship with a primary care provider, which is the traditional attribution cohort in the program, and also if they don't necessarily have a primary care provider but have a full Medicaid benefits package, which is our expanded attribution cohort. So we have two separate cohorts, but both are included in the VMNG program. The program still continues to exclude members who have a primary care provider who is not participating in the OneCare Vermont ACO. And I'll just note that, though 2023 attribution numbers are not yet available, provider participation in the VMNG program for 2023 has remained stable and looks similar to provider participation in the 2022 performance year. Next slide. Just to quickly review the details of the VMNG payment model: one of the primary characteristics of the payment arrangement between DVHA and OneCare is that we negotiate an agreed-upon price for the attributed membership for each VMNG contract year. This agreed-upon price is also known as the expected total cost of care for the services provided to these attributed members. This is illustrated in the green bar on the left-hand side of this slide, which is 100% of the total cost of care.
Additionally, the arrangement that we have includes a risk corridor, which is illustrated by the colored dotted lines, whereby if OneCare spends between 100% and 102% of the agreed-upon price, or between the blue and red dotted lines there, they are liable to pay money back to DVHA, up to 102%. But if they spend over 102% of the agreed-upon price, they're only liable for that first 2% of overage. Conversely, if OneCare spends less than the agreed-upon price, or between the blue and green dotted lines there, they're entitled to retain the difference between actual performance and that full 100% of the agreed-upon price, for the first 2% of underspend. This creates an incentive within the VMNG program to be efficient with resources within the risk corridor. Next slide. So in terms of the VMNG program's 2021 financial performance, DVHA and OneCare agreed upon the price of healthcare for attributed members up front, and spending for ACO-attributed members was approximately $15.1 million less than expected for the traditional attribution cohort, whose expected total cost of care was around $230 million. Spending was approximately $6.4 million less than expected for the expanded attribution cohort, whose expected total cost of care was around $46 million. Because the expanded attribution cohort is still relatively new to OneCare, the traditional and expanded attribution cohorts have distinct risk arrangements, and they were reconciled separately. OneCare is entitled to the full amount of funding below the agreed-upon price and within the risk corridors that we've set up for the traditional and expanded cohorts. After application of other necessary adjustments in our calculations for financial performance, DVHA will issue OneCare a reconciliation payment of approximately $7.1 million. Next slide.
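The symmetric 2% risk corridor described above can be sketched as a small function. This is a simplified illustration only: the function and variable names are mine, and it leaves out the other contract adjustments mentioned in the presentation, including the suspension of downside risk for 2021.

```python
def reconcile(expected: float, actual: float) -> float:
    """Rough sketch of the VMNG 2% risk-corridor reconciliation.

    Returns a positive number for a payment owed to the ACO (underspend
    it retains) and a negative number for an amount the ACO owes back to
    Medicaid. Exposure in either direction is capped at 2% of the
    expected total cost of care. Illustrative only: ignores other
    contract adjustments, and for 2021 downside risk was actually 0%.
    """
    corridor = 0.02 * expected       # 2% band around the agreed-upon price
    difference = expected - actual   # > 0 means the ACO underspent
    if difference >= 0:
        return min(difference, corridor)   # ACO keeps up to 2% of underspend
    return -min(-difference, corridor)     # ACO repays up to 2% of overspend

# Illustrative only: with a $230M expected total cost of care,
# exposure is capped at $4.6M in either direction.
print(reconcile(230.0, 226.0))  # underspend within the corridor
print(reconcile(230.0, 220.0))  # underspend beyond the 2% corridor, capped
print(reconcile(230.0, 240.0))  # overspend beyond the 2% corridor, capped
```

The design choice to note is that the corridor bounds both parties' exposure: spending far above or far below the agreed-upon price changes the settlement by no more than 2% of the expected total cost of care.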
And here is another way of graphically depicting the ACO's financial performance for both attribution cohorts for 2021. The expected total cost of care is the total of all of the components in the two bar graphs. The yellow portion is the prospective payment that is issued to OneCare at the beginning of each month for its attributed membership. The orange portion of the bar graphs is the fee-for-service component of the expected total cost of care that DVHA retains and issues to providers throughout the year on OneCare's behalf; that's for providers who may not be ready to take on being paid prospectively, or providers who are outside of OneCare's network. And the gray is the difference between what was spent and the agreed-upon price, which is owed from DVHA to OneCare up to that green dashed line. I'll just remind folks that due to the public health emergency, there was no downside risk for the program for 2021. So if OneCare had overspent, it would not have been liable for any of that overspend, but in 2021 the spending was less than the expected total cost of care for both cohorts. Next slide. And here is that same graphic depiction of the expected total cost of care, broken down into the prospective portion and the fee-for-service portion, and how actual performance interacted with the risk corridors, but this time for all of the years of the program to date for which we have results. The first three bars from the left are the 2017, 2018, and 2019 performance years; for 2020 we show the traditional and expanded cohorts separately, and for 2021 we also show the traditional and expanded cohorts separately. There have been years, as you can see here, where OneCare was entitled to a payment because they spent less than the agreed-upon price, specifically in 2017, 2020, and 2021.
And there have been years where OneCare was liable to pay DVHA back for an amount in excess of the agreed-upon price, which occurred in 2018 and 2019. Next slide. I think one of the key takeaways from the VMNG program's 2021 performance is that the prospective payments issued through the program, those yellow parts of the bar graphs in previous slides, created and continue to provide a measure of stability in the healthcare system during COVID-19. As noted above, and as we've all seen, COVID-19 resulted in system-wide decreased utilization of healthcare services in 2020, and it likely continued to impact the utilization of healthcare services in 2021, with increased volatility in utilization and revenue as the system starts to come out of the pandemic, though it's still going on. We think that providers who received fixed prospective payments through the VMNG program were better able to weather this volatility in 2021, since they were able to receive a predictable and guaranteed payment from Medicaid through the VMNG program. We think this continues to underscore the importance of revenue predictability for providers as Vermont looks toward evolving its population-based payment models and increasing participation in them in future years. Additionally, the reconciliation payment to OneCare that we discussed for 2021 will allow for additional resources to be directed to the healthcare system as COVID-19-related pressures continue in 2022 and beyond. At this juncture, I will turn it over to my colleague, Pat Jones, who will walk through the program's quality performance for 2021 and then briefly discuss future opportunities for the VMNG program. Great, thank you so much, Amy, and good afternoon, everyone. My name is Pat Jones. I'm Deputy Director of Payment Reform for the Department of Vermont Health Access. I'll walk through a high-level summary of the quality measures and then, as Amy said, spend a minute on future direction.
So in 2021, the measure set for the VMNG program contained 10 payment measures and three reporting measures. One of those reporting measures is the Consumer Assessment of Healthcare Providers and Systems (CAHPS) survey, a patient experience survey that consists of eight composite measures. The high-level result: we did reinstate payment for performance in 2021, after taking a hiatus and doing reporting only for 2020. So in 2021, OneCare's providers earned a total of 13.75 points out of a total of 20 possible points for those 10 payment measures, each measure worth two points, and so the quality score for 2021 was 68.75%. We have nine measures in the measure set that have national benchmarks; most of those come from NCQA Quality Compass, with one exception. In 2021, quality performance exceeded the national 90th percentile for three of those measures, and the 90th percentile is generally considered the highest achievable benchmark, so obviously exceeding that is an accomplishment for our providers. OneCare's providers also exceeded the 75th percentile for two measures, exceeded the 50th percentile for another three measures, and performance was below the 25th percentile for one measure. In a minute, when we get to the table, you'll see which measures fell into which category. I'll note that there is some rigor to the scoring system: the only way for providers and OneCare to get all of the points, to get the two points for a measure, is to exceed that 90th percentile. They get one and three-quarter points at the 75th, one point at the 50th, and so forth. So that's the scoring system. Based on that performance, OneCare's network providers will receive just about $1.6 million in quality incentive payments through the value-based incentive fund that OneCare administers. So next slide, please, Michelle. So again, a hard-to-read slide here, but just to walk you through the format of this table.
The first column is the measure description in shorthand, and then you'll see three tan columns; those are the results for the traditional cohort that Amy described a little while ago. Those are the results that we score against, because again, these are results for members who have a relationship with a primary care provider. But we also wanted to share the results for the expanded cohort, so you'll see the numerator, denominator, and 2021 rate for the expanded cohort in the subsequent columns. Then, back to a tan column, that shows the 2020 rate for the traditional cohort, which can be used as a comparison, and similarly, the green column after that shows the 2020 result for the expanded cohort. Then, in the blue, where we have national benchmarks, you can see what they are. For the measures where we get NCQA Quality Compass results, there are 25th, 50th, 75th, and 90th percentile benchmarks. There's one measure, developmental screening in the first three years of life, where we use benchmarks from CMS; it's actually 30 states that are participating in collecting that measure, and there's no 90th percentile for it. And then finally, in the far column, you'll see the points that were awarded and how they added up to the 13.75 total quality performance. Just a couple of highlights: you can see the color coding in that third tan column. That's where you see whether the measure was at a particular percentile, or, for a couple of our measures where we don't have national benchmarks, whether the performance was statistically the same as in 2020 or, in one case, whether there was statistically significant improvement over the 2020 results. That's the screening for clinical depression and follow-up plan, and as a result of that significant improvement, OneCare's providers achieved two points on that measure. So green is good, 75th or 90th percentile; yellow is above the 50th, so pretty good.
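The percentile-based rubric Pat describes can be sketched as a small lookup. This is an illustrative simplification: the function name and benchmark values are mine, and the presentation only states the top three point tiers ("and so forth"), so the behavior below the 50th percentile is an assumption flagged in the code.

```python
def measure_points(rate: float, benchmarks: dict) -> float:
    """Rough sketch of the VMNG per-measure scoring rubric described above.

    `benchmarks` maps percentile labels (90, 75, 50) to the national
    benchmark rate for a measure. The presentation states 2 points at
    the 90th percentile, 1.75 at the 75th, and 1 at the 50th; the zero
    floor below the 50th is an assumption, since only "and so forth"
    was stated.
    """
    if rate >= benchmarks[90]:
        return 2.0      # full points only above the 90th percentile
    if rate >= benchmarks[75]:
        return 1.75
    if rate >= benchmarks[50]:
        return 1.0
    return 0.0          # assumed floor, not stated in the presentation

# Ten payment measures at two points each: 13.75 of 20 possible points
# works out to the 68.75% quality score reported for 2021.
quality_score = 13.75 / 20 * 100
print(quality_score)  # 68.75
```

This shape of rubric explains the "rigor" Pat mentions: a measure sitting just under the 90th percentile still loses a quarter point, so full credit requires top-decile performance on every measure.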
The one measure, and I think those of you who have been around for a while will recognize it, is the initiation of alcohol and other drug dependence treatment, which has a companion measure, the engagement of alcohol and other drug dependence treatment. On that initiation measure, we were below the 25th national percentile. We could probably talk all day about factors that may be contributing to that, and you hear about some potential factors in your work. Workforce certainly is something we hear about, along with the lingering results of the pandemic, but that's a measure that folks have been working on for a while, and obviously we need to continue working on it. So some good news there in terms of performing at high levels; there are certainly some measures where we're performing above the 50th percentile but would like to do better, and then that one measure where we see a challenge. I know my colleagues from Blue Cross Blue Shield are going to show a really nice slide, which I wish I had thought of, showing what's moving in the right direction and what isn't. When I saw that, I thought I would highlight it for you because there are some real similarities. What I'm about to say you'll hear again when Blue Cross presents, but looking at performance changes from 2020 to 2021, there were quite a few measures moving in the right direction: follow-up after emergency department visits for mental health, adolescent well-care visits, all-cause admissions for people with multiple chronic conditions, and hemoglobin A1c poor control, meaning hemoglobin A1c greater than nine. Both the all-cause admissions and the A1c poor control measures are inverse measures, so lower rates are better for those. Controlling high blood pressure improved, and screening for clinical depression and follow-up plan, which I mentioned before, also improved, in fact significantly.
The measures that pretty much stayed the same were follow-up after ED visits for alcohol and other drug dependence treatment and follow-up after hospitalization for mental health. Those stayed the same, performing at a high level, but when you look at absolute numbers, there's obviously room for improvement. Wrong direction: a couple of measures where results declined, developmental screening in the first three years of life, and I think you'll see that reflected in the Blue Cross numbers, and then initiation and engagement of alcohol and other drug dependence treatment. So that's sort of a high-level overview of the quality results. Next slide, please, Michelle. So what do we see as future opportunities for the Vermont Medicaid Next Generation ACO program? DVHA is committed to continuing to test this model, and we're currently working on an amendment that would allow the program to continue into 2023. We're looking at reinstating the risk sharing. We've already reinstated the quality provisions, as I mentioned, so as of 2021 we're back to pay for performance, but we're looking to slowly increase the financial risk sharing to pre-pandemic levels. We've been able to use this model to test some additional innovations, and probably the most significant thing we're looking at now is planning an expansion of the fixed prospective payments that are in the model. In those slides that Amy showed previously, a portion of the payment is in fixed prospective payments and a portion is in fee for service. We'd like to see more of the FPP and less of the fee for service, so some type of global payment model, if you will. And we think that has promise for everybody. For Medicaid, it allows us to have more predictability in budgeting if we are paying for things in the form of fixed prospective payments, and we think it provides incentives for care coordination and care along the lifespan.
For providers, it makes revenue more predictable, and we could not have seen that more clearly than we did in 2020 when the pandemic hit. And it can help providers because, as we like to say, it reduces them having feet in two canoes for a single payer, and hopefully, as we continue to align with our commercial and Medicare partners across all payers, not only is revenue more predictable but so are program structure and incentives. And then for the state, this type of effort and innovation can hopefully offer a step toward potential future global budget implementation. So I'll leave it there. I think that's the end of our slides, and thank you for the chance to present today. Thank you, Pat and Amy. Andrew, I will turn it over to you whenever you're ready. Thanks, Michelle. Good morning. Can you hear me okay? Great, thanks. My name is Andrew Garland. For those I haven't met, I'm the Vice President of Client Relations and External Affairs for Blue Cross Blue Shield of Vermont. So in addition to working with the ACO and focusing on the all-payer model, I also do a lot of work with our clients; I lead all of our sales and account management functions, so I spend a lot of time with the market as well, including reflecting on this program. It occurs to me, Pat, that you all had some really good slides, and I was pretty remiss in not putting together a context slide. So perhaps before I get started here, I could give folks a little bit of background, especially since I see there are a lot of folks on this meeting who were not with us when we did this talk last year. So let me talk a little bit about Blue Cross and our work with OneCare and the all-payer model. We've actually been partnering with OneCare since their inception. We had a formal arrangement with them even predating the all-payer model, some run-in years where we worked together to develop a methodology.
And we started our arrangement with the shared savings program for OneCare providers even before the all-payer model began. When the all-payer model started, we moved from that shared-savings-only approach to a two-sided risk methodology, and we ran that methodology for several years. Initially, it was only the ACA exchange populations that were in our model, so the individual and small group members. But in 2019, we began to attribute self-funded members, ASO members, to the program. In 2019, that was on a non-risk basis; we attributed them simply to get more funds flowing to OneCare's provider network to help fund transformation. And then in 2020, we moved that population to a risk model very much like the one we had developed for the ACA membership. Of course, the moment we launched the model, the pandemic landed on our doorstep, and effectively we were forced to suspend the risk aspects of all of our programs. I'll talk a little bit more about that later in the deck, but those pools remain in. It's been important to us to be supportive of the all-payer model, so we still have very broad attribution to this program, even though we have suspended some parts of the program to get through the pandemic. Right now, we attribute about 91,500 members in total. About 62,000 of those members are in a risk program, albeit a modified risk program, and those split roughly into about 40,000 on the ASO self-funded or large group insured side and about 22,000 on the ACA small group and individual side. So, Michelle, I think we can go to the next slide. We put this up, and I certainly won't read it; this is really a reference for everybody today and the board members, just to give you a sense of what we're thinking about when we evaluate programs like this, whether we're trying to decide if we should be involved or how to think about our continuing participation.
I'll really emphasize a few things here. At the top of our bulleted list is clinical outcomes. That is so important to us whenever we're thinking about becoming involved in a reform effort. Risk sharing and payment model transformation are really important, but only to the extent that they enable the clinical system to produce better clinical outcomes. That's what it's all about. So we keep the payment arrangements on the second shelf; at the top is what we can do to improve clinical outcomes, and payment arrangements are really just a tool for us. I'll also highlight that second bullet a little bit: reduced cost of care for our members and purchasers. For those who don't know, we are a nonprofit, local health plan domiciled in Vermont. Our mission is to help Vermonters be healthier. So when we're talking about reducing the cost of care, we're thinking about premiums and cost shares. This isn't about shareholder dividends. This is about lowering the cost of care for Vermonters and for Vermont employers who purchase benefits, and trying to slow the growth in cost share that we know challenges many people's ability to access care. And then I'll jump down to the first line of that second full paragraph there and emphasize words like targeted, transparent, and readily understandable. We found in the early days of this work, really predating the all-payer model, that we didn't have to think too much about external stakeholders while we were designing these programs. But keeping support for our work with OneCare Vermont, and our support generally of the all-payer model, has required a tremendous amount of dialogue with the marketplace. And as I'm sure everyone on this call knows, we all read the same news media; there has been a significant amount of skepticism about what we're all working on together. And that's caused us to focus a lot more on words like transparent and readily understandable.
We're really thinking about how we can focus on the fundamentals of these models and talk with the marketplace, and with each other, about the basic things that we're doing here, not losing the core principles in the details that can easily overwhelm us as we do this work. So, Michelle, I think we can go to the next slide. There's a lot to talk about that we've worked on with OneCare, even though these have been relatively quiet times because of the pandemic. I will go through these one at a time and offer some commentary. First and foremost, when I think about the things that are going very well, we have a strong collaborative approach to our work with OneCare. Certainly there is a legal dimension to what we do; we negotiate financials, we negotiate contracts, but at the end of the day, our teams have really moved as far as we can toward partnering and trying to put the needs of the stakeholders that we all serve first and foremost. That has allowed us to have some really tough conversations and, frankly, to change things up in tough times, as we've seen with the pandemic. I'm proud of how our organizations were able to talk about what was happening there and what was best for Vermonters, and to make the changes that we needed to make to support our community during that time. To get a little more tactical with the next three bullets: we shifted our approach to measuring quality in this program last year. I spent quite a bit of time talking about that, but essentially we moved from a place where we were only evaluating quality on a scorecard basis to evaluating it based on the work plan that was being developed to advance those scores. I'll talk about this a little more in the deck, but we wanted to draw a much stronger link between the activities happening within OneCare and OneCare's network and the results that we had hoped to see on the quality scorecard and in the financials.
So we stepped past the scorecard and said, let's talk about the work that you actually intend to do in 2021, the first year we used this approach, so that we'll have a better ability to interpret the results that we're seeing on the scorecard and really try to draw some cause and effect between your work and the clinical outcomes that we're measuring after the fact. There was some similar progress for us on the financial side. The last two bullets are both about the financial target-setting methodology. We have developed a brand new approach to target setting with OneCare Vermont. We've been working on that for much of the year. It's a major change and one that we're pretty excited about. I think it will further isolate the model from some of the external noise, which in the past has made it harder to determine, again, what in the results that we're seeing is really being driven by OneCare's action and what is random or perhaps being affected by other inputs to the system; the rate process that we all go through together every summer is a good example. To a certain extent, I think this new methodology will insulate us from a lot of those outside impacts and help us judge a little more clearly whether there are places where we are in fact seeing financial gains to the system, or losses, frankly, that we would attribute to the activity that OneCare has taken on. And then, also very exciting, and we'll talk more about this: we did suspend the risk program largely for COVID, but we're on track in 2023 to begin moving back to full risk and to really start talking about fixed prospective payments for our population that OneCare is helping to manage. On the challenges side, there are a number of things that I think we could talk about here. The first one we've listed is frankly the most important and really the most germane to our discussion today.
We're still unable to produce evidence that the members who are attributed to OneCare are performing better or worse than the rest of our population. As in past years, we see some results that are moving in the right direction and some that are moving in the wrong direction. That's true for financials and quality, but the clearest statement we can make about that movement right now is that we don't yet have compelling statistical evidence that the movement we're seeing is related to the work that's happening on the ground in this program. And for us, and you've heard me say some form of this already three or four times in this presentation, finding those linkages between cause and effect, between the work that we're doing and the impact that we're having, remains probably our number one priority as we partner with OneCare. Going down the list, we have some smaller, more detail-level things to talk about. We very much want to expand our quality focus to include more on the mental health and substance abuse side, and ideally on the retail pharmacy side as well. OneCare's network has been slow to go there with us. I think it is understandable; they've had a lot on their plate with the pandemic, but we have a tremendous responsibility on behalf of our employer groups and our members to keep working on those quality metrics. They're tremendously important to our community, and they are places where, frankly, we have a lot of room for improvement. So we will keep putting those in front of OneCare and trying to find the right time to ask their network to engage on them. Generally speaking, we still feel that there's an opportunity as a system, and you'll notice that this fourth bullet is written with a fair amount of passive voice. That's not unintentional.
We were not always sure who the subject of this sentence would be if we were to put a noun in charge here, but the reality is that, as a system, I think we still have a lot of work to do to understand how and where care coordination is happening and who has clear accountability for which parts of care coordination. At Blue Cross Blue Shield of Vermont, we have a really clear understanding of what we do with care management, disease management, and utilization management, but we are ready for that work to take a big step forward in coordination with the provider network, and we really have to get clearer as a system. That means OneCare, all of the payers, the designated agencies, everyone who's collaborating to help take care of folks: how are we doing care coordination, and how are we measuring our progress? Do we have clear lines of accountability and clear mechanisms to measure what we're doing well and where we need to shift our focus? And then a final challenge that has emerged relatively recently is the proposal to transition the ACO's data operations from a third-party vendor over to UVMHN. That has definitely raised some concerns from a number of stakeholders, including us, about how much data is being centralized and who has access to which information. With some of the information being business confidential and some of it being health confidential, a lot of the stakeholders that we work with are concerned about this change. So we're going into a lot of detail with OneCare and UVMHN right now to understand exactly how this will work and what sort of protections will be in place to make sure that the data is handled safely at all times. I'm going to pause there and ask if there are any questions before I move into the next sections of the deck. Michelle, would you like to advance to the next slide? Yeah, we can jump past this one.
Okay, I do have some numbers for you in my back pocket, but they're so uninteresting they don't go on the slide. The real story here is that, as I said earlier, when we understood in early 2020 that the pandemic was going to happen for real in the United States, we were very quickly in contact with OneCare, and we talked about suspending the risk part of the program. We knew that utilization patterns would be profoundly impacted by the pandemic, that those swings in utilization would initiate payments either to OneCare or from OneCare to us, and that those payments would be based on distortions of the normal utilization patterns brought on by the pandemic. And frankly, for the sake of the integrity of the program, we really didn't want to be making those payments or receiving them from OneCare. We didn't see how we could turn to the marketplace and say, we need to make a $5 million payment to OneCare this year, in the first year of the pandemic, because utilization has fallen so low, when we all knew that that utilization drop was really about people not being able to get into the clinic because of the pandemic protocols, not real improvement in care. And likewise, we knew that as the pandemic began to alleviate and lift, we would see a big bubble of bounce-back care, avoided care that finally found its way into the clinical system as capacity opened up, and we really didn't think it was appropriate to ask OneCare then to write us a big check back for that care. It would have been hard for us to justify asking their network to do that, or, again, to explain to the market why we were making a refund. We thought that big payments in either direction would really cause the market to question the integrity of the program. I was very pleased that OneCare was extremely willing to listen. We had some really good talks about how to change this program.
And what we decided to do in the end was to keep the risk program but to reduce the financial corridors for payout from the nearly full risk that we had before the pandemic to very small fixed dollar amounts for each population. I think we have four defined pools in the program. The highest corridor for any of them is $50,000; a couple of them are at that level, and a couple of the pools are at an even lower level. So that's the most we would pay out for one of those pools, or receive back from OneCare, during the years 2020, 2021, and 2022. And if you're interested in the actuals, they happened exactly the way we thought they would. In 2020, utilization was way down, so we paid OneCare the maximum amount for each one of those pools; I think the total payout was about $125,000, somewhere around there. And then in 2021, three out of the four pools had huge bounce-back utilization, as we expected they would. One pool surprisingly did not, but that pool has only 2,000 members in it, so it's probably not credible. At the bigger-picture level, I think we can think of 2021 as being the catch-up year for 2020, with OneCare in fact having to pay us back more or less all that we gave them in 2020 for that care that was artificially avoided or prevented by the pandemic. As I said earlier, we are on track to get back to a pretty meaningful risk arrangement in 2023. So a year or two from now, when we're doing the update on 2023 results, we'll be back to a place where we can talk about real performance, we hope. From here we can move on to talk about the quality component of the program. There were supposed to be some quality folks on with me today to help with the details here. This is not my area of expertise, but we've had some flu ravaging parts of our workforce last week, so I may be all alone today. I'll do my best. I mentioned that we built a quality work plan into our contract with OneCare starting in 2021.
We continued that approach in 2022, and while we're still working on finalizing our 2023 agreement, we have a commitment to keep going down this road in 2023. I'll just say again that the goal here is really to try to connect the bottom-line measurement of quality with the top-line activity that OneCare is organizing to drive those quality results. And the reason for taking this approach: from the beginning, as we have measured quality metrics in this program, we have seen some gains, and we have seen some measures that are sliding in the wrong direction, but as we look at those measures year over year, we're not seeing sustained trends. Measures that improve in one year are just as likely to slide back the next year as they are to continue improving. Through the duration of the program, I'm not sure that we have a single measure that we would say has consistently been advancing or declining, and when we look at the way those measures tend to end up, the pattern for members who are attributed to OneCare really doesn't look very different from the pattern of outcomes for members who are not attributed. So we're trying to strengthen our understanding of which interventions might really be driving some results, even if it's in a much smaller segment of the population than we would see when we build the scorecard. We need to find those wins and help push them through the rest of OneCare's network. You can see that we've been focused on controlling blood pressure and screening for depression in the first two years of the program. I thought we had it on this slide, but perhaps it's somewhere else: I think we're looking at hypertension and the possibility of another mental health or substance use disorder metric as we go forward. It is important to note that we do still look at the scores of all of these measures for OneCare. We built a scorecard, and in fact I put my eye chart in the appendix for you.
I won't go through all those lines as well as Pat can, but it's there for you to look at. So we're still calculating the scores, but we're not focused on the scorecard. We're really focused on: hey, what's working? What can we find in this quality program that we can say is a success, that's unique to this program? Let's talk about it, understand why it happened, and figure out how we can do more of it. That's really our focus. I think we can go to the next slide, Michelle. I think this is the slide that Pat meant; you saw an earlier version of this, and it might have been a little confusing. I apologize for that. So yeah, as I said, we have some scores that are moving in the right direction, and these are in some cases really significant moves. Actually, I think these are all statistically significant, but you can see some real improvement on some of these measures. Where we've noted an inverse measure, that just means that a lower score is better on that measure rather than a higher score. So we want readmissions and out-of-control blood sugar to be going down, and we want well-care visits and the control of hypertension, for example, to be going up, and all of these measures are moving in the right direction. At the same time, we've had some measures that have slid in the wrong direction, and again, in some cases pretty significantly. You can see the rate of developmental screenings is significantly down, as Pat mentioned, and the rate of alcohol and drug treatment initiation is also significantly down. And of course, we would caution with all of this: we came into the pandemic not yet having clearly established lines of cause and effect between OneCare's work and our quality scorecard, and then we bring all of the white noise of the pandemic and lay that on top of this situation. So it's pretty hard to interpret these year-over-year results, and I think we will probably say the same thing in 2022 and maybe even in 2023.
Again, that causes us to step back from the year-over-year results and ask, well, are there some broader trends that we can identify? Maybe that's where we need to focus. But at this point, we would say not yet; we just haven't seen those broader trends. I think I have just one more slide. Yeah, so just some notes about the quality data that we just shared. A reminder that the large group and self-insured pool did not come into the program until 2020. So as we look at the 2021-over-2020 results, we just want to caution everyone that that's not a lot of data, and before we start to make any assertions about what's happening on that large group side, we would really need to see two or three years of data accumulate. And with all of this data, the COVID impact may make it somewhat difficult for us to draw firm conclusions about many of the things that we see, even once we've accumulated two or three years' worth of data. And that's all I had to present this morning, but I'm certainly happy to take questions. I think we're going to save all the questions for the end, and so that just means we've got MVP. So I'm going to turn it over to Carla. Yes, thank you, Michelle. I'm Carla Renders from MVP. I'm responsible for value-based and physician contracting as well as professional relations in Vermont. I'm also accompanied by some of my colleagues from MVP today to assist with any questions or follow-up items we may have. But first, let's kick it off with MVP's mission statement. You can go to the next slide. So this is MVP's mission statement, why we are here doing what we do, which is to continue to improve members' health and wellbeing through innovation and collaboration, cultivated both internally and in our arrangements with partners such as OneCare. Our focus is always our members in everything we do, and with them at the forefront of our work, we intend to create the healthiest communities. Next slide, please.
So how do we achieve our mission? With our core values. We want to be the difference for our customers, our members, and make them feel reassured that their healthcare needs will be met. We are curious as to their wants and needs, and we work to anticipate and address those needs for a better consumer experience. And finally, we are humble. Humility allows us to keep an open mind and be receptive to innovative ideas from all of our constituents, be they employees, members, providers, or our business partners. Next slide, please. So getting into the 2021 arrangement, this is the financial program overview, the components that make up the arrangement. This is the second year of the MVP-OneCare contract, and the arrangement we had for the 2021 performance year was very similar to the one we had in 2020 and to what we have this year in 2022 as well. The program covers lives under qualified health plans, meaning commercial, individual, and small group membership sold on the Vermont exchange. This is an upside-only, total-cost-of-care shared savings arrangement, with the amount of savings shared being subject to a quality gate. We'll look at the quality metrics further on in the deck; the metrics that we use for the quality gate were selected from the all-payer model. As part of the arrangement, MVP provides OneCare with eligibility, claims, and financial analytics for the attributed population on a monthly basis. MVP also continues to provide OneCare with a monthly primary care investment payment, as we did in 2020, which is then distributed to the downstream providers. You can go to the next slide. So let's look now at the financial results of the 2021 performance year. This illustration demonstrates, by quarter, how OneCare performed against the targeted budget, with the gray bar being the budget that was set and the blue being the actual financials.
The budget was set using 2020 utilization data, which, as we all know due to the pandemic, would have set the target considerably low. Therefore, we applied an inflator to the budget to account for the expected uptick in service utilization in 2021, and we ended up with a per-member-per-month budget of $405.57. Although the budget overage was considerable at the point of settlement, it's important to note that the year started out at a much higher overage, closer to 40% over budget, and ended at 25% over budget. So OneCare did demonstrate improvement over the course of the year, though it's hard to say how that improvement was realized, as Andrew stated previously. While the financial results do look concerning, there is a story to tell as to why OneCare's performance took a turn in 2021 compared to the 2020 results, when we did achieve shared savings: again, the pandemic and public health emergency. So moving on to the next slide, and I feel like this is redundant because it echoes what previous presenters stated about the contributors to what we saw in 2021. First of all, we know there was a rebound in utilization after the 2020 lockdowns. We all expected there would be an increase in utilization in 2021 due to care being unavailable for a large part of 2020. What we didn't expect was that trends did not return to baseline; rather, they vastly increased even compared to pre-COVID years. We're seeing this trend in Vermont overall, not just in the OneCare arrangement. Additionally, the complexity of services increased: patients who had avoided care or couldn't obtain care in 2020 were returning to receive medical attention and were presenting with more costly and complex issues. And finally, we have had continued expense surrounding COVID, while it was expected that COVID costs would actually decrease some in 2021. Payers were required to continue to cover the member cost shares associated with testing and treatment after 2020.
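For reference, the overage figures cited here are straightforward PMPM arithmetic. A minimal sketch follows; the $405.57 budget is from the presentation, but the actual-cost PMPM value used below is illustrative only (the quarterly actuals were not given), so no settlement figure should be read from it.

```python
PMPM_BUDGET = 405.57  # per-member-per-month budget from the presentation

def overage(actual_pmpm, budget_pmpm=PMPM_BUDGET):
    """Fraction by which actual PMPM cost exceeds the targeted budget."""
    return (actual_pmpm - budget_pmpm) / budget_pmpm

# Illustrative: an actual PMPM of about $507 corresponds to the 25%
# year-end overage cited at settlement.
print(f"{overage(506.96):.0%}")  # prints 25%
```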
Additionally, in quarter one and quarter four of 2021, COVID costs were even greater than the costs we saw in 2020 for the same time periods, which again was an unexpected trend. So overall, the bottom line in the OneCare financial arrangement is that we have yet to see a normal year, and it may be some time, if ever, before the patterns return to baseline. Although in 2022 we are seeing a large drop in costs associated with COVID testing and treatment, the year is still not over. What we expect for the future is that trends in utilization are going to continue to increase due to easily accessible care. As we know, members have utilized medical and behavioral telehealth services, and that trend is continuing. And while this is a positive impact, of course, for members' access and health, it's evidence that we more than likely are not going to return to the utilization patterns we saw pre-COVID. So to address this new normal, MVP continues to be an eager and willing participant in conversations about payment reform. We're actively participating in the state-led work groups that are looking at ways to contain costs, the savings from which will ultimately be passed on to members, and to improve member outcomes through alternative and global payment models. We can advance to the next slide. So this is an overview of the 2021 quality program. The program was very similar, again, to the one in 2020 and the one that we have currently in 2022. The quality metrics were selected by OneCare using the standard all-payer model metrics, and we used 2020 benchmarks. The point system would determine the amount of shared savings due to OneCare if there were shared savings; this year, since there weren't any, that is less relevant.
And it should be noted, when we look at the scorecard, which we're going to do in the next slide, that three of the measures had such low denominators that the points for those measures had to be redistributed, considering that MVP has a smaller population than some of the other payers on the call. That occurred last year and it occurred this year again, where we simply did not have members in the denominator. And we only look at metrics that have 30 or more members in the denominator. So with that, we can go to the next slide. And this is an illustration of OneCare's quality scorecard for 2021. This is what we distribute to them upon final settlement. And despite the financials we just saw, the quality scores were definitely a bright spot in '21. OneCare was able to increase their performance from 50 points in 2020 to 85 in 2021. While OneCare continued the good work to remain in the 90th percentile for diabetes control and all-cause readmissions measures, it's important to note the highlighted areas, where there was movement from the 50th to the 90th percentile for child and adolescent well care visits, a move from below the 25th percentile to the 50th percentile in controlling high blood pressure, and finally, most impressively, a move from below the 25th to the 75th percentile in the initiation and engagement of treatment for alcohol or other drug abuse. So that concludes the arrangement overview for 2021, but we wanted to move on and just do a quick retrospective and prospective review. So the highlights of '21 are that MVP and OneCare have continued to work within a collaborative and cooperative team environment. We've advanced our knowledge on how the pandemic has impacted these types of arrangements, and we can now plan better for the future. For instance, implementing payment reforms that would keep revenue consistent rather than the fast fluctuations we're seeing from year to year.
And as we just reviewed, OneCare had commendable quality performance in '21, and both parties participated in state reform work groups in '21 and are continuing to do so. So moving forward for this coming year and beyond, MVP and OneCare are actively negotiating the contract for 2023, and we're hoping to tie that up by the end of the year. We will be moving to downside risk for the first time in 2023. We're also reevaluating the quality program to ensure that we're using appropriate metrics for this population. And we've agreed to add new metrics to the program, which are in the process of being vetted now, to better reflect the population that MVP has under the OneCare arrangement. And finally, MVP will continue to fund the primary care investment dollars, which OneCare is optimizing with their population health model for continued improved performance. And then finally, in respect to global hospital budgets, MVP has been actively engaged in conversations with OneCare to advance our exploration of how a hospital global budget would work within a total cost of care arrangement. We're targeting 2024 for a potential implementation if we can come to an agreement. There are considerations to be made within a commercial arrangement, as opposed to a Medicaid arrangement, with respect to member cost share, changes in member metal levels throughout the year, and leakage of services outside of the arrangement. And for MVP, it would be important for us to have an appropriate number of hospital participants under the OneCare arrangement to really make the greatest impact. So these are not unsolvable challenges by any means, but the devil is in the details, and we're working this year and into next to iron out some of those nuances. And finally, we will continue the important conversations that are occurring surrounding payment reform with the state and other payers. So that wraps up our presentation. Thank you. Thanks, Carla.
I will turn it back over to you, Chair Foster, for board and public questions and comments. Great, thanks, Mr. Gray and the folks from AHS and Blue Cross and MVP. I forgot to mention earlier when we started that the board has a hard stop at 3 p.m. today. So we have a full hour, so we should be okay, but there is a hard stop today. With that, we'll turn it over to board questions or comments, and we'll start with Mr. Walsh. Thank you, Chair, and thank you to each of the presenters. I have just a few questions. First, I think I could ask Michelle. Michelle, the payer crosswalk slide way back at the beginning. Yeah, thank you. So OneCare has all of this data presented to them, correct? So you just heard about each of the columns, the Medicaid Next Generation, Medicare initiative, Blue Cross, and MVP programs, whose data was presented to you today. The all-payer model measures are specific to the agreement between the state and CMS. So this is made to just show you kind of the alignment across all of those programs. Some measures in the all-payer model are ACO-specific, but it is multi-payer in most cases. So simply adding up scores wouldn't result in the same score that we would use for the purposes of federal reporting. So I'm looking at the screening for clinical depression and follow-up. It's straight across from 2021. Yep. Yep, and it looks like that's a measure in all of the plans except for MVP. Is that correct? Carla, if I messed that up, let me know. I think that's correct. Which metric, I'm sorry, can you restate what the metric is you were looking at? Screening for clinical depression and follow-up. Correct, that is not part of ours, no. Yep. So if I asked OneCare Vermont for the proportion of their patients who have a positive screen for depression and they told me that that would require a manual chart review, that's not actually correct, is it?
I believe so; not from the Medicare side, but from the staff side, that there would be a claim for the actual test conducted, but that doesn't mean that the result is part of that. The result is what has to be chart abstracted. And there are members from OneCare here who could better respond to that than I. Well, I don't have to tell you, we've just got an hour, but I'd like to understand: when we see screening for clinical depression, that suggests to me that patients are receiving a screening test. Is it 100% of patients who should be screened, or what percentage? And then, of the patients who are screened, who has a positive score, meaning that the score suggests they are depressed, right? I think I'm trying to understand what it takes for OneCare, or the providers who are participating, to understand things like what proportion of their patients have a positive depression screen. And anybody... I think that Pat has unmuted herself and she can probably speak to this from the Medicaid side, but it does require chart abstraction in most cases. Pat, go ahead. So the measure, screening for clinical depression and follow-up plan, looks at whether a person is screened for depression and, if they are screened and the screen is positive, is there a follow-up plan that day? Not that the follow-up occurs that day, but that there's a plan that day. And it is a chart review measure, and I can share how we do it for Medicaid; I'm not sure what Blue Cross does. But we provide a random sample. So it's a random sample of records. We identify the full universe of people that would be eligible, and it's a lot. We provide a random sample to OneCare, who then works with their providers to do the chart abstraction for these measures. And so the result is: were they screened, and if positive, was that follow-up plan there? And if others know more than I do on that, I defer, but that's the measure as we approach it with Medicaid.
Okay, I'm not trying to put anybody on the spot or give anybody a hard time today. I'm trying to better understand, right? And it seems to me, if we know the total number of attributed lives in a plan and then we're tracking how many were screened, that's a proportion. And that number should be pretty readily available. And then of the patients who were screened, some will have a positive screen, others will have a negative screen; also a proportion. Of the patients who screen positive, some will need a follow-up plan; also a proportion, right? It's just division. And I'm trying to understand if we have a system in place that can do that. So should I, is it okay if I? You can say yes, or you need to look more, whatever you'd like; I'm just trying to understand. I also see that Dr. Wolfman has raised her hand from the ACO. So Pat, you're welcome to go, but if you wanna defer to Dr. Wolfman, you can also do that. I'll defer to Dr. Wolfman. That would be my choice. So I'm gonna let her speak. Go ahead. Okay, thank you. I think my colleagues texted that they got kicked off of the meeting for some reason. So I guess I'm the OneCare rep for the moment. And I'm a practicing primary care clinician too. So it's a good question, Tom. I appreciate the question. I regret to say that we cannot collect the data any better so far than we are able to. The PHQ-2 and PHQ-9 screens that we all seem to use, I think most people use those, are entered into particular fields in the electronic health records where they can be collected. But if they're not, they cannot be collected. And then there is a score, which could also be collected. The follow-up so far, though, is not something that is consistently documented electronically so that we can gather it. So that's why the manual abstraction, unfortunately. The drive to have annual visits for as many ages as we can is important, because those visits are expected to include screening for depression and also anxiety now.
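The cascade of proportions described above really is just division. As a minimal sketch with entirely invented counts (none of these numbers come from the meeting), it might look like:

```python
# Illustrative screening cascade with made-up counts (not real data).
attributed = 10_000          # total attributed lives in the plan
screened = 6_500             # received a PHQ-2/PHQ-9 depression screen
positive = 900               # of those screened, scored positive
with_followup_plan = 600     # of positives, had a same-day follow-up plan

screened_rate = screened / attributed          # proportion screened
positive_rate = positive / screened            # proportion screening positive
followup_rate = with_followup_plan / positive  # proportion with a plan

print(f"screened {screened_rate:.0%}, positive {positive_rate:.1%}, "
      f"follow-up plan {followup_rate:.0%}")
```

The difficulty raised in the discussion is not the arithmetic itself but that the numerators (screen results and follow-up plans) are often not captured in structured, extractable fields.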
And those screenings need to be recorded consistently. So that's one of the other reasons that we are driving forward with expecting more annual visits. No, I appreciate that. I'm familiar with those two screens, the two and the nine. And I'm familiar with how hard it actually is to get those things into the record and then pull them out. But we're supposed to be. Someone somewhere is supposed to be. This is part of, that measure looks like it's part of, every plan except for the MVP plan. But it's very difficult to do is what I'm hearing. And we're not sure that anybody's doing it consistently or accurately at this point in time. The next question that I had, I think it's still for you, Michelle, and I'm not trying to be hard. I'm really sorry if I'm coming across like I'm the mean professor, right? You said there was a score; the overall score would have been 82.5 for 2021. Would that have been sufficient to receive any savings had there been savings? I'm trying to do math in my head. I would rather confirm that with CMS. That's fair. I'm going to confirm with them. I don't want to speak out of turn. Okay. I think that my next questions are for the presenter from DVHA. And I didn't get everybody's name, and I'm very sorry about that. And the slides don't all have numbers, so it also makes it really hard to say where I am. I wanted to know a little bit more about the expanded cohort versus the regular cohort. What makes an expanded cohort? I'm sorry if I missed that, but I'm not sure. No, that's okay. It's complex, like anything related to attribution. It's very Byzantine, almost. So the traditional cohort that we have can also be thought of as, I think, just the regular cohort, like Medicare's. It's the original attribution methodology that I think a lot of ACO programs are used to seeing.
So those are members who have a full Medicaid benefits package, don't have another form of insurance, and have a primary care provider that's participating with OneCare. So that is the traditional way of thinking about attribution, which is the way I think that Medicare and other payers think about it as well. And that's our traditional attribution cohort. In 2020, we were trying to get some of the scale targets bumped up. And so Medicaid thought about how it could modify its attribution methodology to attribute more members than just members who were demonstrating past primary care utilization with a PCP participating with OneCare. And so the expanded methodology goes a step further than the traditional methodology, and it looks at basically just whether a member had or has a full Medicaid benefits package, no other form of insurance, and is not being seen by a PCP who's not participating with OneCare. So this could include Medicaid members who don't have any utilization in the baseline period that we look at when we go to attribute; they could be new to Medicaid, and therefore also not have any utilization in the baseline period; or they could be seeing a provider or a specialist that is not designated as attributable, so a designated agency or another specialist or something of that nature. So that's the expanded population. They don't necessarily have a PCP, and some of them are active in the healthcare system and some of them are not, but their overarching qualification for being part of the program is that they have a full Medicaid benefits package and no other form of insurance. Is that helpful? Yes, so I'll try to summarize; just tell me if I'm wrong. The expanded pool is everybody who has Medicaid; the traditional is everybody who has Medicaid and has a PCP. Has a PCP that's participating with OneCare. So we exclude from the attribution members who have PCPs who have decided not to participate with OneCare Vermont for Medicaid.
Would those individuals fall into the expanded bucket? No, they fall off the analysis. Okay, so I'm trying to understand: there were some slight differences, not huge but slight differences, in the slides about the traditional and the expanded cohorts. And I'm trying to kind of adjust my priors, as some like to say. Would I expect there to be a difference between those two groups, or should they be similar? I would say that they're different because they're accessing different parts of the healthcare system, or have in the past accessed different parts of the healthcare system. The traditional cohort has a PCP; the assumption is they're being care managed, or at least they have a medical home, so to speak. And they may be more hooked into the healthcare system in that traditional sense. Whereas the expanded population, they might not even have claims. They might not even be accessing or utilizing the healthcare system, or they're utilizing more community-based providers or specialists, I would say. They have different utilization patterns is one thing that I would say. And I would also invite either Alicia or Pat to jump in if you also wanna add anything, but feel free not to. I think you covered it well, Amy. Just looking at it from the quality perspective, I think you'll see that some of the rates for many of the measures are lower for the expanded population. And that could be a testament to the role of primary care. A lot of these measures relate to procedures and actions and care that could occur in a primary care office. And so I don't think it's surprising to see the rates lower. Yeah, so I was thinking the same thing. And what's interesting as I look at this table, the places where the rates are much lower in the expanded cohort are for children, right? And so that part makes sense. In the adult populations, the rates aren't that different. And so I wanna keep an eye on that in the future because of those implications.
I also wanna just make sure, on slide 24, it looked like savings were achieved in 2017, but then a loss in '18 and '19, and then savings in '20 and '21, which you're telling us we can attribute to decreased utilization due to the public health emergency. Is that correct? I think that that's probably one of the reasons and one of the drivers. There are probably also many other drivers, but I think that's one of them, yes. Okay, so three years, and three's a small number, but of the three years prior to the pandemic, there was one year where savings were achieved; the other two were losses. Yes. In the quality measures, and I didn't get the slide number, but the overall score here, if you go back one, the overall score is 68.75. And then in the next one, the initiation of substance use treatment, that score in the red is really low, right? Is there a quality improvement plan to address these? Is there a corrective action plan? Yeah, I mean, this is a measure where we're seeing low rates across the system; Blue Cross indicated that as well. MVP said that they had seen some improvement, but the absolute numbers are still pretty low. So I would say that, working together with OneCare and its network providers, as we talk now in preparation for 2023, there's a real focus on mental health and substance use disorder treatment in particular. A lot of these measures, in fact all of them, when you look at the 30-day follow-up after ED visit measures, the initiation and engagement of alcohol and other drug abuse and dependence measures, and then the follow-up after hospitalization for mental health, these are measures that really involve providers in different settings working together to provide care. And a lot of them are around either new diagnoses, in the case of initiation and engagement, or transitions in care, in the case of the ED and the hospitalization measures.
So the conversations that we're having now are: how can we support providers in, admittedly, a very difficult environment? You hear about it as much as or probably way more than we do about workforce and access to care, but how can we establish programs and set things up so that we're supporting providers and working together? So I'll just say, it's no mistake, either in the all-payer model quality framework or in our respective measure sets, that mental health and substance use disorder treatment figure heavily. And so there are definitely conversations about, and plans about, how to support providers in improving these results and, in the process, hopefully improving outcomes for our members. Pre-pandemic, things had started to look a little bit better, but here we are. And I don't want to conjecture that it's solely the pandemic, but it does appear that that was a factor. I appreciate the background. Tom, the other thing I want to flag, sorry for interrupting, and the ACO can certainly chime in here if they want to, but the other thing is that the ACO, because of 42 CFR Part 2, is blind to some of this data, to the results here; they wouldn't receive those claims. So they wouldn't necessarily know who those attributed patients are. Carrie or Dr. Wolfman, sorry, if you'd like to speak to that, you're more than welcome to, but you don't have to, and I'm happy to have a follow-up conversation with you, Tom, about that if you're interested. I appreciate the background, but it sounds like there is not a specific action plan to address this underperformance. Hi everybody. My name is Josiah Bieler. I'm director of value-based care at OneCare. I just wanted to address the specific question. So thanks for having me today. It's really an honor to be here representing OneCare. Thank you. So I think I don't need to repeat the comments around the impact of the pandemic and the reality that we are still in the pandemic.
I do want to talk about our program design that we've been spending a lot of time working on to improve. So our new program for 2023 is called PHM. What we've done is we've taken some of our previously separate programs and blended them into a unified framework that we can leverage to increase accountability for our network as we need to. So for 2023, we have our measures selected. This particular measure is not on there. Again, we don't have access to detail. Sorry. I have a separate question coming up about how you choose. Yeah, no problem. I've tried to be really patient and not interrupt anybody, but three people have spoken now and I don't hear anybody saying there's a plan to address these numbers. But suicide rates are higher than they've ever been in Vermont, and we have evidence in these scales that people are suffering, but I'm not hearing an improvement plan where somebody's saying, I'm writing the plan, we're gonna do these things. People are having conversations is what I'm hearing. Is that correct? You're talking about making a plan? No, I would say we have a plan. We have a plan for quality improvement for our network, and we have to choose the focus areas across it, right? So we have 18 measures across these payers. Providers have told us it's challenging to focus on 18 measures. So we've in the past kind of drilled down to: here's where we're gonna focus. Yes. And could this be a future focus? Absolutely. Is this a focus right at the moment, among the six measures that we're focused on for '23? It is not. I'd like to see the plan if it's shareable. Blue Cross Blue Shield has two metrics later on in the presentation, hypertension and screening for depression. And I wanted to ask, who decided? How did you decide to focus on those things? Yes. Our list of metrics was actually developed really early in this program. I would say that from Blue Cross's perspective, it's okay; we can do better for a commercial population.
But when the program was started, we were asked by more or less every stakeholder to try to keep our set of measures limited and aligned with what the government payers were doing. And we understood that in the early days of this program, OneCare and its network would need some time to get organized. But you also heard me say that we are putting some pressure on OneCare's network to do more as the capacity to do that opens up. It's a good list, but for us, it's not great. There is more that we should be focused on. I'm not hearing a clear articulation of a method for deciding where improvement efforts should focus. And I'm not hearing that there is a plan to address the lowest scores that we're seeing. I hear that we're developing plans, but I don't see... you know, on the one hand, that's okay, but on the other hand, we've been going at this for several years now. If I can jump in here and add a comment, this is Derek Rains, Director of Payment Reform at OneCare. I'd like to also add, in response to your questions, Dr. Walsh, about improving the mental health measures: we do have our CPR program. In 2023, we're incentivizing practices to universally screen for depression, anxiety, and suicide. And then we are also incentivizing practices financially to embed interventions in the primary care setting, so to pay a mental health provider to provide services within the primary care setting. So those are aspects of the CPR program that have already been approved by our board for enactment in 2023. So I would point to that as a specific plan that's in place right now, that will take effect beginning in January, and that directly impacts those measures. Okay, so what would really help convince me there are good things happening here is that we see that we're at the red box, 32-something percent, right now; the CPR plan goes into effect; and in 2023, '24, '25, that number should improve.
Correct, and measurement of these quality scores is also a component of that incentive program. So it's not just a matter of, hey, we'll pay you this extra money if you do these things; it's, we're gonna pay you this extra money if you do these things and work with us to measure the impact of doing these things. So I understand your comment and I think that what I've said is responsive to it. I think so too; it's on the right track. Sorry, hang on. Sorry, I'm gonna interrupt; there are too many people randomly jumping in here. There's a raise-your-hand function, and please use that and I'll call on people, okay? I just can't have five people jumping in. I appreciate you guys providing answers and information, but let me try and keep it organized, because we have public comment to get to and everything else. So Mr., is it Mueller or Muller? Mueller, please. Thank you, apologies. I will use the raise-the-hand function. Thank you. So I wanted to add a couple of comments. One is, the way we decide our measure focus is through our clinical committee structure. So we actually have a work group called the measure selection work group. We vet these ideas through that group, who then recommends the measures of focus to our population health strategy committee, which is a committee of our board, who ultimately takes those recommendations to our board. And I'll also add that we have a plan with DVHA nearing finalization around a focus for our designated mental health agencies, and their additional focus to improve follow-up after emergency department visits. Okay, so there are plans, not necessarily chosen by the data but chosen by teams, and plans are being made. And in future years, we should be able to compare the pre-score and the post-score after the intervention starts. Is that correct? Go ahead, Mr. Mueller. Thank you. Yep, no problem. So I wouldn't say that it's not using the data, right?
We've analyzed data and shared our findings to make these decisions. There are certainly lags around some of these data, right? There's a process where we have to vet these through different work groups, build policies, gain approval, get stakeholder buy-in; multiple months are required to build this up to the point where now we've got a framework. Let's take our framework, let's leverage it. We see a red square. I don't like red squares. Let's fix that. Yep, yep. And who drives that process? Who's in charge of creating those plans? Go ahead, Mr. Mueller. Thank you. Well, we have leads for each of our work groups as well as our committees. So we've got a committee chair, and we have subcommittee leads. Overall, is it OneCare that's in charge of creating the process and quality improvement plans? We control what we incentivize for our members, in agreement with our payer contracts and whatnot. And we build incentivization and quality improvement support for our network. I can't speak to the payer side. Mr. Raines. It's hard to find accountability. Sorry, I couldn't hear that. I couldn't hear the question. I'm sorry. I said it's hard to find accountability. So in the interest of... I didn't think I had a lot of questions, but it's generated a lot of talking. In the interest of time, I wanna just skip ahead a little bit and turn it over to my colleagues. But I'm troubled. This makes my head hurt, that I don't see a clear process for deciding who's going to address what based on the data, and a clear plan that someone is going to push forward to improve these numbers. And someone's gonna say, it's my plan, we made this plan. With Blue Cross Blue Shield, there were two measures chosen. I've beaten that up, I guess; I talked about it too much. But in the progress and challenges slide, great effort was made to say we can't link the ACO support of providers to reported outcomes. I understand statistically causality is a difficult thing.
And without very specific techniques to account for what it takes for causality, people will always say there are other reasons: there's bias, there's confounding, there's lack of statistical significance. That problem is gonna stay, right? That causality difficulty. But there are other statements, about the network being slow to adopt a mental health and a substance use metric, that don't require statistical analysis. Why is it slow? And I should just reflect that we are in the middle of a contract negotiation with OneCare, and I don't think we should be having that negotiation in this forum. But I suspect, and I shouldn't be offering commentary on behalf of the network since people from OneCare are here, I suspect that there's just fear about taking on more work, taking on measures that are hard to affect. We heard about the complexity of getting at this data. We also know that primary care physicians have a lot on their plates, and they don't always love doing these screenings. It can be awkward conversations. So it's a little hard for us to pin down. But I think this gets at the root of the question you were asking earlier: on the commercial side, the measures are chosen through a negotiation process. As a payer, we have our preferred measures, and OneCare comes to the table, and we talk about what's doable, and we end up with a compromise, which of course is less than ideal, as compromises always are. The work plan that I described, and the two measures that you've mentioned a couple of times, are an attempt by us to get at some causality, right? To actually understand from OneCare: what are you doing? What are the activities that you're pursuing, so that we can dive much deeper into the data and see, is there some place we can see that activity coming out the other end?
If we know specifically what you're doing, which practices are doing it, which other practices are participating, then I think we have a fighting chance to start segmenting the data and finding that output. So that's really what that work plan is about, to try to get there. Yeah, I'm interested in trying to find the... I said last week I would like OneCare and ACOs to succeed. I'm having trouble finding evidence of meaningful impact, and that's what I hear you saying too. So I imagine that the difficulty with advancement of coordination of care and accountability would receive a similar reply. And so one last thing from this slide: I also share concerns about the transition of ACO data to the UVM Health Network. So I wanted to say that. One final question, with MVP. On the slide of looking ahead for opportunities in 2023, there's... go back one... the third bullet, optimization of OneCare investment through population health. What does that third bullet mean? So as I mentioned, part of our program with OneCare is that we provide a $3.25 PMPM as a primary care investment fee. And when we had discussed how those monies are being used, OneCare did present to us their population health model, which Josiah just mentioned. So that's what we mean by optimizing those dollars that we've provided the past couple of years. Okay. Thank you all. And back to you, Chair Foster. Thank you very much. Next, Dr. Merman, did you have any questions or comments? Yeah, thanks, Ellen. I have a few. Thanks, everybody, for presenting; super informative and helpful. And I just have a few things. First, just something for Pat Jones. It's impressive to see the greater than 90% of benchmark for the follow-up after ED visits for drug and alcohol, and for mental health. And I guess there was one other that they got points for that didn't meet it.
The one question I had, again on this engagement in alcohol and drug measure, is: do you know if those measures include patients who have engagement with recovery coaches, as opposed to, you know, billing clinicians? I do not believe that they do, Dr. Merman. It's a claims-based measure. So if it's something that can be billed, then it will be picked up in the claims. If it's something that's not billed, it would not be picked up in the claims. Because, I mean, that's one area in all of this where we might actually be performing quite a bit better than we appear to be, because I think a lot of the drug and alcohol treatment is going through recovery coach programs, which I don't think bill in Vermont. So it'd be something to... I don't know how we can, maybe you and I can chat sometime if there's a way to get that. Sure. Andrew, I understand the flu is in your home office. I wanted to ask you a little bit about, you had mentioned that, basically, I'm really interested in how Blue Cross Blue Shield measures clinical outcomes. You say a top priority for Blue Cross Blue Shield is a high-performing clinical system; what do you use when you measure high-performing clinical systems, or really good clinical outcomes for patients? Is it the same measures that we're tracking through the ACO system, or are there other things that you're tracking on these patients? Yeah, we measure many, many more measures than we track for this particular program. We have a fully staffed quality improvement department that measures the whole population, attributed and unattributed, across a range of measures that are required by NCQA, and then a number of other measures that are just interesting or important to us because of opportunities or challenges we know have existed in Vermont. That team also runs 20 or 30 projects a year to go after measures where we're having some challenges. Unfortunately, it's not my team.
So I can't go too deep on that with you, but those folks would be very happy to talk to you about the things they measure, the way they set their agenda each year, and the projects that we run to try to make progress. I think that'd be really interesting, to learn which measurements, in your learning and experience, really matter to be tracking. I also was curious, though: you at Blue Cross Blue Shield mentioned cost is sort of the second, I don't want to say it's the second tier, but it was the second in order. Do you measure or evaluate, or how do you measure and evaluate, cost and affordability of your insurance plans for Vermonters, and how have recent times changed that? Is there a way you track the affordability of those plans, and whether people are falling off those plans, and can you comment on those measures? Yeah, looking at the clock, I'll have to say that the short version is no health plan in Vermont is affordable right now. I'm amazed that we find ways to afford them, but our perspective is we need to lower the cost of care wherever we can do it. So when we're thinking about cost savings, we're really looking at progress from the present. I do think there's been some attempt to establish a framework for Vermont that we should at least slow the rate of growth. We push a little harder than that and say, no, we think there are actually opportunities to reduce costs from where they are today if we can get coordinated and focused on them. So our way of evaluating really is to benchmark where we are today and to ask ourselves, how are things moving? Can we get anything moving down, or where we have long established rates of growth, can we slow those? What is it gonna take to bend those down to the point where we start to reverse the trend? Could you share with me some ideas on how you think we could actually decrease the cost? Where are those areas that we're looking at?
Yeah, so really quickly, we know we are spending a lot of money on avoidable emergency room use. We know that despite our best efforts, we're still spending a lot of money on some expensive drug therapies, both on the retail side and particularly when delivered by a physician or a hospital, so a hospital administered drug; lots of opportunity there. Those two jump to the top, but our quality folks, I think, would list out for you a half a dozen measures that would say, hey, we still have lots of people whose diabetes is out of control, who are sort of moving through the system in these acute, crisis-driven modes. We need to get down to root causes on a lot of the people living with chronic conditions and manage all of this a whole lot better. So I think we could look, frankly, almost anywhere. We can also look at the way that we rationalize care across the system. We don't always steer the care to the most cost effective care delivery site. And in a lot of cases, we have routine, low complexity care happening in places that are designed to handle anything but, right? And all the cost that comes with that preparedness we're spending when we don't really need to. And then we're leaving some really good, low cost, high efficiency shops sort of underutilized, right? We need to be driving more patients to some of those. So I think there's a huge range of things we could talk about here. Yeah, that was my next sort of... Real quick, I'm gonna interrupt. Ms. Grace Gilbert Davis had her hand raised; I believe, from memory, she's from BCBS too. So I just wanna turn to her. Hi, thank you, folks. And I am the one that has the flu, so I apologize in advance for my voice. So, Dr. Merman and others, Blue Cross is gonna be presenting to the primary care advisory group, I believe in February, to talk about all of our value-based pilot programs and learning collaboratives and what we're doing to improve quality and reduce cost.
And I would encourage you to attend that meeting if you can. For sure. You know, with time in mind, I think I probably should move on to the next person, but thank you very much. Okay, we're gonna take things a little out of order because of the hard stop today, and I don't want to miss public comments. So, my fellow board members, I apologize, but I'm gonna turn to the public for comment. And if we run out of time, myself and the other board members can submit written questions to the panel. So again, I apologize. But if the public has any comments, please raise your hand now and we'll take those in the order your hand is raised. Ms. Gilbert Davis, your hand is still raised; is that because you have a public comment, or was it just left up? Oh, it's down. All right, thanks. Left up, sorry. Okay, no, no worries. Okay, great. All right, well, we'll go back to board questions, seeing there's no public comment. Ms. Lunge, do you have any questions or comments? I have nothing burning, so in the interest of time, I'll go ahead and skip any questions. Thanks. Thank you. And Ms. Holmes? I actually just have one question. Maybe it's directed to Blue Cross Blue Shield, although I'm curious to hear if MVP has any thoughts on this. It's obviously very, very challenging to assess performance over time during a pandemic. As everybody's outlined, there are too many confounds. And so I'm just wondering, Mr. Garland, I know you spoke a little bit about analyses you've done to compare financial and/or quality performance over time between ACO and non-ACO attributed patients, which seems like an interesting strategy to try and get rid of at least some of the pandemic trends that might be affecting the results. So I'm wondering if you can talk a little bit about whether you've done matched-sample analyses comparing patients who have always been attributed to the ACO versus those who have never been attributed to the ACO.
And what sort of analyses you were referring to when you mentioned that type of work? Yeah. I'm not gonna be able to go into as much detail... oh, it just moved, there you are... as I think you would like to hear, but we can certainly follow up with you on some of the details. But I think the short answer is that we've looked at it a lot of ways and we haven't gotten there yet, but we're continuing to try to find something. And increasingly, the strategy that we're looking to more is to sort of segment the data, to assume that maybe when we look at all 90,000 lives, things that are moving deliberately as a result of somebody's work are a little bit obscured. But if we could start perhaps to break these populations apart, look at the practice level or cut it in other ways, we might begin to find pockets where we actually see sort of year over year trend. But let me follow up. I know we have thrown out the suggestion to do a matched comparison study. It's pretty complicated, well, it's costly. So I'm not sure that we've gotten there yet, but let me find out for you. Great, thank you so much. And in the interest of time, I will turn it back over to you, Chair Foster. Thank you. I only had one question, because I wasn't sure if I heard it right, but Mr. Garland, I think it's around slide 32. There was a mention about care management and care coordination efforts at OneCare. And my question was, does Blue Cross Blue Shield have care coordination strategies or tactics that it uses that it could share with OneCare? Well, the first part of the answer is, for sure, we have care managers, nurses who work on our staff, who do their best to connect with patients that they think, based on the data we have flowing into us, are likely to need support. So a combination of factors: what they're diagnosed with, what they're experiencing, and who might be available to support them. And those care managers do direct patient outreach.
We also have, as I mentioned earlier, a quality department that is running many, many projects every year to try to improve those quality metrics, the bottom line stuff. And often that work comes together. I think care coordination, as we understand it, is meant to take that work that we do and plug it into the similar work that is happening, let's say at blueprint sites or at primary care practices where resources have been brought in to move into the population health management space, and that care coordination will begin to align and coordinate all of those efforts. So we're not strictly in the care coordination business, we're in the care management business, but it's our anticipation that we will move into that care coordination system as it's designed and stood up around the all-payer model and the OneCare construct. Well, I'd just encourage, if there's been a concern about the speed with which progress is happening, if there are learnings you can leverage from your organization to OneCare as part of your work with them, I would encourage that. Mr. Mueller has his hand raised. I'll call on you, Mr. Mueller, please go ahead. Thanks so much. I just wanted to add that we've given a lot of consideration to the importance of care coordination in our policy development for 2023. So we actually consider it a gateway for payment in our programs for both our primary care providers as well as our continuum of care providers. And we have some specific criteria that they're required to meet in order to be eligible for payment. So that's the first comment I wanted to make. I also wanted to comment on our commercial quality scores, because I didn't actually see the eye chart specifically for Blue Cross, but I'm looking at our scorecards, and I have two scorecards here. I see 15 measures. Of the 15, I see nine measures above the 90th percentile, three measures above the 75th percentile, two at the 50th, and one below the 25th.
And if I look at MVP, seven measures: I see five measures above the 90th percentile, one above the 75th, and one above the 50th. Thank you. And with that, I'll turn it to the healthcare advocate for any questions or comments they may have. Thanks so much, Chair Foster. I'll be super brief, as I know this is a hard stop. Thank you to everyone that presented today. It's helpful to see these outcomes and results in one place. I just want to point out, and I think others have made this point, but maybe a variation of it, that it remains a shortcoming that we don't have an independent causal evaluation of the results that are presented. So we don't really know what normal expected variation year to year is, and we don't know if positive or negative results would have happened with or without the APM or OneCare. And we really need this to assess effectiveness, cost, and value. And I want to recognize that this isn't a requirement from CMS, to do this type of work or this type of analysis, but I think Vermonters, and all of us here today, would hope for it; just because we're not required to do it doesn't mean that we shouldn't try to do it as a state. And I think we have the capability and capacity to do it. Thanks. Thank you. And seeing as we have eight more minutes and there was some public comment that didn't get in, I'll turn to that. And I apologize for the ordering today, but Mr. Hoffman, I see you have your hand raised. Please go ahead. Thank you. Can you hear me, Chair? I can, yes. How are you? Good, thank you. I just wanted to interject regarding the understandable complexity that a pandemic added to all this. It is clear, and we had Blue Cross Blue Shield today talk about what I think they felt was a fidelity to the spirit of the agreement, that they wouldn't take money or return money for results related to underutilization.
And I would just ask why the same can't be said for our public payers, who are ultimately funded by taxpayers. But more importantly, I recall the early days of the pandemic thinking, obviously, utilization is gonna go down. The telehealth supports came online almost immediately. I work in the mental health and substance use field. I had alerted the board during that time: how great would it be if we put all these investments to work? We can stratify; we know exactly who our mental health, substance use, and potentially suicidal patients are. We could pull codes to see if they ever had an inpatient stay related to mental health, and we could preemptively go out and encourage our HSAs to be going to these folks with additional telehealth supports. It would have been a great time to sort of pivot to the work that we could do, as opposed to just taking savings for work that we tragically weren't able to do. And it's just peculiar to me that that's never been touched on through all of this, that that's work that could have been done where lives could have been saved, because as recently as this week in VTDigger we're seeing that there are still many lives being lost to opioid overdose, which is expected when people have been isolated. So I would just provide that to all of you for consideration: why was that not more preemptively looked at and addressed, and what could be done in the future to make sure that something like that is addressed, God forbid we go through another pandemic like this. I'll also just point out that I think it's pretty clear, as of today it's been established, that there's not clinical data available in real time for the quality measures against which these folks are measured. And so I would just submit to you: with small-n annual abstractions, is that sufficient, really, to provide ongoing quality improvement throughout a year?
Should that not make us question, as Member Walsh did last week, are we not better off championing Blue Cross Blue Shield's efforts to pilot accountable care payments to federally qualified health centers, who can already receive value-based payments through their waivers? Wouldn't it make more sense for those who have the EHR clinical data, who are capable of effecting that change, to actually be doing the work and entering into these contracts, as opposed to an overly complex, sort of Kafkaesque arrangement that, as of today we're alerted, does not have the clinical data necessary to provide ongoing quality improvement? Thank you, Chair. Thank you very much for those thoughts and insights. And thank you to all the participants for being here and presenting this information. You did a great job, and it was very helpful to myself and, I'm sure, the other board members. And thank you to the OneCare folks for raising your hands and stepping in to provide some information on the fly. It was appreciated. And with that, I'll turn to whether or not there's any old business to come before the board, any new business. And is there a motion to adjourn? So moved. Second. All those in favor, please say aye. Aye. Aye. Motion carries. Thank you all, have a wonderful day, and a happy Thanksgiving to everybody. Drive safely if you're traveling. Thanks. Thank you.