Executive Director's Report, Susan Barrett. Thank you very much, Mr. Chair. I have a few announcements. First, on the schedule for the rest of November: this evening, back at the GMCB offices, we are holding a Primary Care Advisory Group meeting at five o'clock. We have a potential vote scheduled on Monday on the benchmark. And the most important thing this week: the Rural Health Task Force is meeting up in St. Johnsbury tomorrow, at the hospital, though not all day. If folks are in the area and want to participate, I think it would be well worth the trip. The other announcement is an update on the open public comment period for the ACO budget. We have moved it out based on the information we're going to hear today, some of the presentations, and the timeline. Right now that open public comment period ends December 2nd, and I would encourage folks to check out our website, where we have listed all the public comments on the ACO budget to date. And that's all I have to report. Thank you. Oh, actually, I do have one more thing: General Counsel Barber has a rate decision that he needs to read. Yes, I just need to announce the board decision regarding MVP's 2020 large group rate filings. On November 13th, 2019, the board issued a decision on the filing submitted by MVP Health Plan for its 2020 large group HMO products, docket number GMCB-008-19rr, as well as the filing submitted by MVP Health Insurance Company for its 2020 large group point-of-service products, docket number GMCB-007-19rr.
The filings impact approximately 1,800 members in Vermont. The board ordered the carriers to: one, adjust their unit cost trend assumptions to reflect Vermont hospitals' approved FY20 budgets; two, add a 19-cent per-member-per-month charge that had been inadvertently excluded from the rates for changes in benefits; three, remove a $1.82 PMPM load related to an expected agreement with OneCare; and four, reduce the proposed contribution to reserve from 2% to 1%. These ordered changes reduce the average rate increase across all quarters of 2020 from 15.7% to 14.9%. As the board knows, these large group products are experience rated, so the premium increase any particular group sees will differ from the average based on that group's claims. Thank you, Mike. The next item on the agenda is the minutes of Wednesday, November 13th. Is there a motion? So moved. It's been moved and seconded to approve the minutes of Wednesday, November 13th without any additions, deletions, or corrections. Is there any discussion? Seeing none, all those in favor, signify by saying aye. Aye. Any opposed? Okay. So the next item on the agenda is the HIE plan. We'll welcome Sarah Gensler down, and whenever you're ready, Sarah, take it away. Thank you very much. For the record, this is Sarah Gensler, Director of Strategy and Operations, staffing the board's work related to health information technology and health information exchange. The board has three major oversight responsibilities related to HIT and HIE. (Can you put the microphone closer so people can hear you? Is this better? Is it on? Can people hear now? Not really. Well, I'll speak up, and if I was too quiet at the beginning, I can circle back.) So, the board has three major oversight responsibilities related to HIT and HIE. The first is to review and approve VITL's budget, which we do in the late spring every year.
The second is reviewing the state health information technology plan, now known as the HIE plan. DVHA is required to revise the plan annually, with a comprehensive update every five years, so we're in year two of that cycle. Finally, the third responsibility is reviewing the connectivity criteria for the providers connected to the Vermont Health Information Exchange, or VHIE. VITL is required to present criteria for approval annually before March 1st, and we've started to review those annually in conjunction with the HIE plan; the board approved the 2019 connectivity criteria in November 2018 when it approved the HIE plan. To support the board's review of the HIE plan and connectivity criteria, staff have developed the following principles for review. These are the same principles we recommended for review of the 2018-2019 HIE plan, adapted for this year: there are four criteria for the HIE plan and two for the connectivity criteria, and we'll walk through each of them as we go through the staff analysis. (Excuse me, could you just put the mic between the two of you? Yes, absolutely.) All right, a quick reminder of the process. DVHA submitted the HIE plan to the board on November 2nd, and we held a two-week public comment period from Monday, November 4th, through Friday, November 15th. On November 13th, last week, DVHA and VITL presented the HIE plan and connectivity criteria at a board meeting, and today we've noticed a potential vote. Any questions before we start the staff analysis? Good to go. So now I'm going to walk through each of the principles for review that staff developed.
Title 18 describes requirements for the HIE plan, including supporting the use of electronic health information, educating providers and the public, supporting interoperability, proposing strategic technology investments and recommending funding mechanisms, incorporating and integrating with existing initiatives and systems, and addressing governance and security. As submitted, the 2019 update to the HIE plan meets each of these criteria, building on the groundwork of last year's plan. In particular, this year's plan adds a roadmap for technology investments in the short and medium term. In addition, effective March 2020, statute also specifies that the HIE will use an opt-out consent model. The HIE plan demonstrates progress toward meeting this requirement on time, although as of last week there are still some workflows to be finalized for that date. The second criterion is alignment with the 14 principles for health care reform that are described in Title 18. In its 2018 decision approving the 2018-2019 HIE plan, the board found that the plan spoke to several of these principles, and those areas remain core to the 2019-2020 plan and have not changed. I've included a few examples here of principles that are particularly relevant to the HIE plan and how I think the plan meets them. The third principle for review is consistency with relevant legislation. This year that is mainly Act 53 of 2019, which focused on HIE consent, but I will also touch on Acts 73 and 187, both of which focused on HIE effectiveness, operations, and governance. I believe the HIE plan meets this criterion. In particular, Sections 4 and 5 of Act 53 require the HIE plan to reflect an opt-out, presumed-consent policy effective March 1st, 2020, as we previously noted. That will supersede parts of the current HIE consent policy approved by the Secretary of Administration and the board in 2014.
DVHA reported at the November 13th meeting that the workflows for that process are still in development. For this reason, I'll be recommending that the board request an addendum to the HIE plan reflecting Act 53, to be put into effect prior to the March 1st implementation date. At that time, once the addendum is in place, I would also recommend sunsetting the 2014 policy. On the elements related to Acts 73 and 187, the HIE plan also demonstrates continued efforts to improve HIE governance, operations, and effectiveness, which continue to be core goals of the plan. Finally, the fourth principle for review asks whether the HIE plan incorporates national best practices and expertise as well as feedback from Vermonters. On the first question, alignment with national best practices and expertise, this year's HIE plan builds on the foundation of last year's comprehensive update. That update utilized national standards and models for HIE governance, technology, and financing, and drew on models from states and other localities across the country with particularly effective and successful HIEs. Those best practices remain core to the HIE plan. Regarding stakeholder engagement, DVHA and VITL have sought significant stakeholder input on both the HIE plan and technical roadmap, as well as on the consent implementation. The HIE plan work itself included stakeholders from a variety of HIE constituencies, including primary care, hospitals, mental health, substance use treatment, public health, the ACO, payers, technologists, and consumers, as well as a handful of state entities. In addition, DVHA staff and contractors sought feedback on the HIE plan and technical roadmap from providers, including the board's Primary Care Advisory Group, and from a number of other stakeholders, which are listed on the slide.
DVHA has also held four focus groups with individuals who actually use the HIE, including care coordinators, data analysts, technical architects, and payers. On consent, Act 53 required very significant stakeholder engagement in HIE consent policy implementation planning. As you heard from DVHA, they made a pretty large effort to engage stakeholders representing all Vermonters, including the ACLU and the Health Care Advocate, as well as groups representing populations who might have particular concerns related to privacy or require different approaches to communication. So this concludes the staff analysis of the HIE plan. Are there any questions before we move on to the connectivity criteria? Any questions from the board? Great, thank you. Now for the connectivity criteria, there are two principles for review. First, do the criteria align with the HIE plan goals, support HIE plan implementation, and support the state's health performance goals? As submitted, the 2020 criteria build on the 2019 criteria, which met this standard, and they have added additional data elements to the minimum tier two data set to further align with state program needs, and have defined tier three criteria. Second, are the proposed criteria clear enough to be operationalized by VITL, the state, and providers? The 2019 criteria were developed with this goal in mind, and they include specific standards and requirements that helped Vermont providers negotiate with EHR vendors, an improvement over the criteria in place before that time. The 2020 criteria maintain that structure and add some further clarity, especially for tier three. Are there any questions on the connectivity criteria? Doesn't appear to be. All right, so finally, the board received two verbal public comments at the November 15th meeting, from the Office of the Health Care Advocate and the Vermont Developmental Disabilities Council. Both complimented DVHA's stakeholder engagement efforts on the HIE consent policy change.
Those are the only two comments we received; we did not receive any written comments. So in light of the public comment, the staff recommendation is to approve the 2019-2020 HIE strategic plan as submitted, with the following condition: to comply with Section 4 of Act 53 of 2019, DVHA shall return to the board prior to March 1st, 2020, to propose an addendum to the 2019-2020 HIE plan, effective March 1st, 2020, to reflect opt-out consent and document how opt-out consent will be managed. In addition, the staff recommendation is to approve the 2020 connectivity criteria as submitted. That's all I have. Are there questions from the board? If not, is there public comment or questions? I don't see any. Does a board member wish to make a motion? I think it's appropriate to do two separate motions. I think Tom's ready to make a motion. I need some practice here with this. All right. I move to approve the 2019-2020 health information exchange strategic plan as submitted, with the following condition: to comply with Section 4 of Act 53 of 2019, DVHA shall return to the board prior to March 1st, 2020, to propose an addendum to the 2019-2020 HIE plan, effective March 1st, 2020, to reflect opt-out consent and document how opt-out consent will be managed. Is there a second? Second. Is there discussion? Seeing none, all those in favor of the motion signify by saying aye. Aye. Any opposed? Okay. Would someone wish to make the second motion? I move to approve the 2020 connectivity criteria as submitted. Is there a second? Is there any discussion? Seeing none, all those in favor signify by saying aye. Aye. Any opposed? Thank you. Thank you, Sarah. Thank you very much. So next, we're going to invite Sarah Lindberg and Michelle McRae down to talk about the 2020 Medicare benchmark. Just make sure you're close to the mic and use your extremely loud voices, because for some reason the sound is not carrying well today.
All right, so Sarah and I are here; I'm going to give a quick all-payer model update, and then Sarah's going to walk through the Medicare benchmark recommendation. Staff came up with this wonderful slide. We just wanted to give the board and the public an update on the timing of reports, as we are expected to be producing several; as you can see in item number three, we've got five upcoming deliverables. We also have some changes to annual reporting that we haven't brought to your attention before. I just wanted to highlight that we did add an annual cost of care report, timed to allow for adequate claims run-out. We have also been approved to move our statewide health outcomes and quality of care report to the end of this calendar year, and that's due to many factors, one being claims run-out and another being the availability of data from our external data sources, including BRFSS, CDC, and the like. We've got a couple of data issues that we're working through. As everyone knows, Blue Cross has a new claims processing system, and that change has created some delays in VHCURES data. We're also working to procure a risk grouper, and that affects our payer differential assessment report, which is also due to be completed by the end of this year, but we will be requesting an extension due to those issues. Any questions? Hi, my name is Sarah Lindberg. I'm a health services researcher with the Green Mountain Care Board, and I'm here to review the staff recommendation for the 2020 Medicare benchmark. So, as a reminder, what even is a benchmark? We throw that word around a lot. The benchmark for Medicare is a financial target that is set for each participating ACO under the Vermont All-Payer Model ACO agreement. We only have one ACO operating at the moment, so there's only one target we need to set for 2020. Basically, a target is set, shown by the green dotted line.
And then the ACO's performance is assessed relative to that target. If spending comes in below the line, that's savings, and the ACO keeps the difference. If it were to go above that line, it would be a loss, and the ACO would pay Medicare back. So it's very similar to how other programs work. What we do that's a little different from other programs is set two separate targets: one for beneficiaries who are living with end-stage renal disease, and a separate one for those who are not eligible due to ESRD. The reason is that while the ESRD population is small, it is extremely expensive, so splitting it out adds some risk mitigation for the ACO. The benchmark for each of these subgroups has three main factors: an estimate of historical experience, a number of prospectively aligned beneficiaries, and a trend rate. The historical experience is our best guess about what this population has cost to date. The tricky part is that 2019 is still in progress, so we have to do our best to estimate what the 2019 experience is going to be in order to apply an annual trend rate. The trend rate is what the board votes on; it's the board's estimate of what growth will be for the upcoming year. The aligned beneficiaries are the factor over which we have the least control: that's the folks who attribute to a Vermont provider. Under the data sharing agreement we have with the federal government today, we are only allowed to have data for people who live in Vermont, but some people who attribute to Vermont providers don't live in Vermont. So we don't have the full census of the ACO population today. We want to make sure the data we do have represents the ACO population, but it's not going to be one-to-one. We are working toward a different data sharing agreement with the federal government that would allow us to have the full picture. So that is the benchmark.
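The three-factor structure just described can be sketched in a few lines. This is an illustrative sketch only: the function names are mine, and the simple trend-times-history formula is shorthand for the methodology discussed, not the agreement's exact calculation.

```python
def subgroup_target(historical_pbpy, trend_rate, aligned_beneficiaries):
    """One subgroup's target (computed separately for ESRD and non-ESRD):
    trend the estimated historical per-beneficiary-per-year experience
    forward one year, then scale by the prospectively aligned count."""
    pbpm = historical_pbpy * (1 + trend_rate) / 12  # per-beneficiary per month
    return pbpm * 12 * aligned_beneficiaries        # total dollar target

def settlement(target, actual_spend):
    """Positive result: savings the ACO keeps.
    Negative result: a loss the ACO pays back to Medicare."""
    return target - actual_spend
```

Note that the per-beneficiary-per-month figure is what stays fixed; only the beneficiary count moves between target-setting and settlement.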
So historically speaking, the board didn't have the capability to produce that historical experience number itself, and so it worked in partnership with federal colleagues to get the best estimate. To better fulfill the duties of the Green Mountain Care Board as envisioned in the agreement, we're actually taking on that task this year, using our data to estimate what the experience looks like for the folks who would attribute to OneCare for 2020. One critical difference in the way we're approaching the modeling is that we think it's really important to ask: based on this 2020 network, who would have attributed to the ACO back in 2011? And then rinse and repeat: who would have attributed in 2012 based on that provider network? That way you're looking at a series of rolling cohorts, a rolling population over time. And as you can see, that looks much different than asking who is going to be attributed in 2020 and what their historical claims experience is. When you look at the historical claims experience for just the actual 2020 beneficiaries, the cohort is younger in the earlier years, and some of them weren't even eligible for Medicare back in 2011, so you'd really be underestimating the historical spend. To best represent the risk the ACO is facing, we need to look at these rolling populations. The main difference is that, because it's a Medicare population, we're expecting a certain amount of mortality every year. Folks are going to die, and those end-of-life care costs are quite expensive; we estimate around $4,000 per member per month for beneficiaries who pass away during the performance year. So that's one of the big deltas we see as these timelines converge. We know that everyone in the 2018 bucket is still alive, because they're going to be attributed in 2020, whereas some folks passed away during the course of 2018 when we take the cohort for that year.
It's probably a lot of boring detail, but it's a really fundamental, important assumption we've taken in our modeling. The other thing we've done is recognize that some exclusions can't be calculated during a performance year. As an example: if it seems like my primary care relationship is with a Vermont doctor and I was aligned to the ACO, but it turns out I end up getting the majority of my primary care in Boston, there's a protection built into the model where that person is taken out of the population for settlement purposes, saying, you know what, we thought you were going to be on the hook for them, but it turns out that's not really your patient, so we're going to take them out. You can't figure out who delivered most of the care, whether it was in the Vermont network or somewhere else, until after the year is over. So we think it's really important to use complete claims so we can model the final settlement values; that way, when we set a target ahead of time, it's the same number the ACO and participating providers would expect at settlement. So those are some of the important things we're doing when we model our estimate of the experience. The second component of our equation is the prospective beneficiaries. Again, that's the number over which we have the least control, and there are really two different numbers in play here. There's the prospective, benchmark-scale population: the full number of beneficiaries attributed to the ACO. Attribution is completed six months ahead of the performance year, so this 2020 population was attributed back in June of 2019 (sorry, not 2018; I've been looking at too many years), and June of 2019 is when we cut off the experience.
Not all the beneficiaries the ACO has on the list for 2020 will show up in January. Some of them will die between June of 2019 and January of 2020, or lose eligibility for other reasons: maybe they move out of the country, maybe they sign up for Medicare Advantage, maybe they get a job and Medicare is no longer their primary payer. So this prospective number is the ceiling on the patients the ACO can be accountable for. Come settlement time, that number goes down, and it always goes down; once you're on the list, you can only attrite off. This is our estimate, through September, of where the number stands for settlement after accounting for people who have already lost eligibility. That number will continue to go down for people who lose eligibility in October, November, and December of 2019, and after the year is over we'll identify the folks who ended up getting most of their primary care somewhere else, and they will be taken out for settlement purposes. So it's very much a moving target, and again the one about which we have the least information. We can do our best to estimate, from the data available, what we're missing, but there doesn't seem to be a huge difference between these populations: the folks for whom we do have data look pretty similar to the ACO-wide data that's shared with us by our federal partners. And last but not least is the trend factor. That's the real decision point. Again, that's where the board says, what do we project, or what are we guessing, is a fair amount of growth for the ACO to expect from 2019 to 2020? And there are some guardrails around this in the all-payer model agreement. Every April, Medicare puts out what's called the Medicare Advantage final announcement, which contains the estimated rates for entities that want to bid to provide Medicare Advantage, for their capitated rates.
Capitated is a stretch; for their base rates. And those estimates include fee-for-service projections: guesses for the full national Medicare population over time. Nationally, what they said in April of 2019 was that, given the best data available, per-beneficiary-per-month spending in 2019 for folks eligible for reasons other than end-stage renal disease would be just over 900 dollars, and they think that's going to go up to $941 in 2020. So that projected growth, again just a guess for the national fee-for-service population, is 4.16%. The similar exercise for the ESRD beneficiaries yields 3.14%. The agreement says that the highest trend the board is able to use is 0.2 percentage points below these projections, so the ceiling on the allowable trend available to the board is 3.96% for the non-ESRD benchmark and 2.94% for the ESRD benchmark. I will say that Vermont growth has historically been less than the national projections, and in recent years even national growth has come in a bit below the national projections, but guessing is a hard thing to do: some years you get it right, some years you're way low, some years you're way high. Hard telling, not knowing. So again, based on the data available to us, the staff recommendation is that for the non-ESRD population, a trend rate of 3.5% should be used to set the Medicare benchmark for the ACO. The rationale is that as Medicare scale increases, it becomes more and more influential on our all-payer target. So there are really two things we have to do: stay at least 0.2 points below the national projection, and convince our federal partners that the trend will also allow us to meet our goal of all-payer total cost of care growth of 3.5%. So this $9,765 is a per-beneficiary-per-year estimate.
That estimate is subject to change as more data becomes available to us, and it's something we're still sorting through. Trended forward at the recommended rate, it works out to a per-beneficiary-per-month target of $842, which is then multiplied by the prospective beneficiaries. So the idea is that at settlement this per-beneficiary-per-month number stays fixed, and the only number in the equation that changes at settlement is the beneficiary count right here. Again, that's the people who have lost eligibility along the way; that count only goes down at settlement, but we would want to lock in the per-beneficiary-per-month amount. Have you done any calculation on the volatility of that 72,327? What would be the range, one way or the other, that it could change? Oh yes; it will only go down. This is who is attributed; this is the 2020 population, so that is fixed. It will go down, and it should go down less than it did in 2018. So I would guess probably about 85% of that is what's going to stick around for settlement purposes. So 85% is the best guess? Yeah. Sarah, you touched on something that I think might be important to make more explicit, which is that while the board votes on the trend factor, it is still subject to CMS approval. That is true, yeah. So it's not up to us. What we do is make a proposal, and then CMMI says yes, go ahead, or says no, and we have an opportunity to revise it and resubmit. It must be approved by CMMI before the end of the calendar year. Thank you. No problem. Are there any guardrails on the approval? So the things we have to prove are that we're staying at least 0.2 percentage points below the national projections and that the trend allows us to meet our all-payer target of 3.5%. There's also some language about making sure it doesn't create any unfair discrimination, and a few other details we need to prove, but the two big ones from a finance perspective are that our all-payer 3.5% is achievable with our proposal and that we're 0.2 below the national projection.
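The trend ceilings and the headline target quoted above can be checked with a few lines of arithmetic. A minimal sketch: the variable names are mine, the rounding conventions are assumptions, and the 72,327 beneficiary count is the 2020 prospective attribution figure cited in the discussion.

```python
def max_allowable_trend(national_projection_pct, haircut_pts=0.2):
    # The agreement caps the usable trend at 0.2 percentage points
    # below the national Medicare fee-for-service projection.
    return round(national_projection_pct - haircut_pts, 2)

# Ceilings quoted in the presentation
non_esrd_cap = max_allowable_trend(4.16)  # 3.96% for non-ESRD
esrd_cap = max_allowable_trend(3.14)      # 2.94% for ESRD

# Per-beneficiary-per-month target from the $9,765 per-year estimate
historical_pbpy = 9765
recommended_trend = 0.035                 # staff-recommended 3.5%
pbpm_target = historical_pbpy * (1 + recommended_trend) / 12
# 9765 * 1.035 / 12 = 842.23, matching the roughly $842 cited

# Total non-ESRD target before settlement attrition, in millions
prospective_bennies = 72_327
total_target_m = pbpm_target * 12 * prospective_bennies / 1e6
# about 731 (million dollars)
```

These figures are mutually consistent: running the same total at the maximum 3.96% trend instead of 3.5% yields roughly $3.3 million more, the difference raised in the board's questions.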
So can we just talk a little bit about the trend rate of 3.5? The maximum allowable that came out would have been 3.96, right? Correct, yeah. And I just did a quick calculation to see what the difference would be off of that 731; it's like 3.3 million dollars. And part of the rationale is, A, the scale that we're achieving, as well as how it compares to the savings we've gotten in the past two years? Yeah, and also how it rolls into our all-payer targets. The number that matters from a financial accountability perspective for our obligations under the agreement is actually quite a bit higher: we have to include everybody statewide, and we also have to include shared savings as part of our spend. So we need to make sure all of those things combined keep us under that 3.5% compounded annual growth. Any other questions? All right, so ESRD: again, many fewer beneficiaries. The advice here is to use the maximum allowable trend, which is 2.94%. This population is extremely small and extremely volatile, so we do think it poses relatively more risk, and we would recommend using the maximum allowable trend for the ESRD population, with the same caveat that the experience number is still being refined, so this might not be the final proposal. And then finally, in the past few years there has been a shared savings component to the benchmark. The recommendation here is to support the ACO in continuing its funding of the Blueprint and SASH. In their budget that was just under $8.3 million, 8.29, and we would recommend including that in the benchmark. CMMI has agreed to continue advancing those savings to the ACO so that it can pay them out for the Blueprint; it helps with some cash flow. And the understanding is that, as in 2018, the advance would be factored into the settlement, so any savings on top of that would be what the ACO nets.
So from a calculation standpoint, that's what it shakes out to. The total, just under 759, is a much higher number than you saw in the ACO budget, and that's entirely driven by attribution: their attribution numbers are quite a bit higher than they projected in their budget. What other questions, comments, or concerns can I try to address? You might just explain, between the ACO budget and what you're projecting here, what the moving parts in attribution were. Because I think you and I agree the per-member-per-month pricing is comparable, but the actual attribution numbers have changed quite a bit. Yeah, that was, I think, a surprise to everyone. But the major changes to the ACO's Medicare network were Springfield no longer participating and the addition of several FQHCs into the Medicare line, and the thinking is that the FQHCs brought in quite a few attributed lives. I just want to make a comment about the Blueprint and SASH piece, because that was a really important part of the original negotiation: ensuring continued Medicare support for both the Blueprint and SASH. With SASH in particular, that program is almost 100% supported by the Medicare dollars; there are some state dollars, I believe, on the administrative side. So I think that's a very important piece of this whole calculation. I understand it's an important piece of the calculation; I'm just questioning why it's linked to the benchmark, since it's in the budget. Because the benchmark calculation is how CMMI is able to fund that investment; they otherwise no longer have a pot of money sitting around to fund those demonstrations. And quite frankly, it was not, I think, the negotiating team's first choice, but it was what was allowable under the federal provisions. Other questions for Sarah and Michelle? Is there any public comment? Walter.
As someone who has been taken out before, meaning I've lost my insurance, I'm curious what the phrase "taken out" means in this case. So for financial accountability purposes at settlement, the people who stay in for the ACO's financial settlement are the folks who remain eligible for the full 12-month period, meaning they retain both Parts A and B of their Medicare, Medicare is their primary payer, and they receive the majority of their qualified evaluation-and-management primary care services within the ACO network; or all of that was true up until the point at which they died during the course of the year. So the two groups that stay in for settlement purposes are people who were eligible the full time, or who were eligible up to the point of their death. Now, the ACO is still accountable for anyone attributed to them, but the others don't factor into the financial settlement. Strictly the Medicare population? Yes. Excuse my ignorance, I'm new to this: what does ESRD actually stand for? End-stage renal disease. Okay. Chronic kidney failure. Yeah, very expensive. Yes. Other comments? Susan. I'm Susan Aranoff, with the Vermont Developmental Disabilities Council. I noticed a new phrase both in this document and in the savings, I'm sorry, in the 2018 results document that was posted and is going to be discussed this afternoon. It's the, seems new to me, phenomenon of referring to the funds that pass through from the feds to the state to support the Blueprint and SASH: those funds are now being referred to as something called advanced shared savings, and they're showing up in the Medicare savings in today's slide as if they were savings from operating the Medicare ACO program. So I'm just wondering if you can tell me, Mr. Chair, is this a change in terminology that the Green Mountain Care Board came up with, and if so, when? Or was this a change in terminology from our federal partners?
So I don't know that that's a change, but I'll refer to Sarah Lindberg to give a better answer to that question. Whether or not we were consistent in our nomenclature, that's always what it's been characterized as. Even in the agreement, the seven five is characterized as advanced shared savings. So in the all-payer model agreement, $7.5 million is earmarked as advanced shared savings to represent the historical investment in the MAPCP demonstration. And that was advanced and then netted against their 2018 performance. So they said, okay, we're expecting you to have saved at least as much, so that was factored into their performance. So I think the gross savings was around $13 million, but after you take off the 7.7, they netted about $5.5 million. So the money, I'm just trying to understand, my understanding is that that's money that comes into the state, that gets distributed through OneCare, but it goes to non-participating communities as well. So I was surprised to see that money, which is just passed through OneCare, somehow being attributed to OneCare savings. It is. They're at risk. So if they were to save less than $8.2 million, they'd be paying that money back. And it might be helpful to reference the federal evaluation for the medical home programs, in which prior to this agreement, the Blueprint was funded by Medicare through MAPCP, which stands for Multi-Payer Advanced Primary Care Practice, a medical home demonstration. And the federal evaluations of those programs across the country were largely inconclusive in terms of savings. The Blueprint and SASH, however, had very good federal results that did show Medicare savings.
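The netting arithmetic just described is simple; as a rough worked example using the approximate figures cited in the discussion (the spoken numbers vary slightly between $7.5 and $7.7 million for the advance):

```python
# Approximate figures as cited in the discussion, in millions of dollars.
gross_savings = 13.0           # total Medicare savings before netting
advanced_shared_savings = 7.5  # advanced up front for Blueprint/SASH (the MAPCP legacy)

# The advance is netted against performance: the ACO must generate at least
# the advanced amount in savings before any net shared savings accrue.
net_shared_savings = gross_savings - advanced_shared_savings
print(f"net shared savings: ~${net_shared_savings:.1f}M")  # roughly the ~$5.5M cited
```

If gross savings had come in below the advanced amount, the difference would be owed back, which is the sense in which the ACO is "at risk" for the advance.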
And so the reason that the federal government was willing to continue that investment by advancing the shared savings was because they expected that that underlying program, which the ACO is required by law to build off of, those savings are built into our projections. So it has always been considered part of the shared savings component, and the ACO has been at risk, even though you're correct, there are participants who receive those dollars who are not ACO participants. So I would like to renew, Mr. Chair, I believe I sent you an email about this in the past and I'll follow up with another. It would be really helpful to get the Medicare results in the format that we're used to seeing them, which has something like the total expected cost of care and the actual total cost of care, because throwing in this Medicare money that's just being passed through to the state really makes it hard for us to look year over year, in general, at the years of operation of this particular ACO, OneCare, to see how it's performed in Medicare. It's been in Medicare ACOs of one type or another since 2013. And by now blending these advanced shared savings into the charts that show savings, the Green Mountain Care Board is really adding a new element in its data reporting that makes it very difficult for people like me who are just trying to track the numbers, follow the numbers, and do the math. You're changing the format. So if we could get the total cost. So again, I'll just repeat, I don't think we are changing the format. Well, and we can only provide Medicare payer results in the format that Medicare as a payer is willing to provide them. This is not an area where we as a state have authority or a control lever; it is an area which Medicare as a payer controls. So is Medicare willing to provide me the total actual cost of care and the expected cost of care? I think that was in the results presented today, right? Yeah, that's not the end result. It's in a second one.
Medicare, so another factor in this particular area, I don't know if you're aware of this or not, but Medicare, the Innovation Center, has a great website that shows where innovation is happening all over the country. And you can click on the innovation website and look at the results of the Accountable Care programs and the shared savings programs and the Bundled Payment programs all over the country, all different kinds of exciting innovation going on. And in the past, until Vermont had its own unique product, you could click on the results of all of the Next Generation ACOs, or all of any type of ACO that Vermont was participating in. And you could find CHAC, the Community Health Accountable Care results, on the chart with all the others, and you could find OneCare. But now, because OneCare is in this unique product, and Vermont is so unique, we're in this unique product, you can't find those results any longer. And I have emailed with CMS, it won't surprise you, I'm not really getting responses, but I haven't given up. But I've been emailing with both CMS and with you folks to try to get the kind of numbers that we used to have, which were the basics: what is the total expected cost of care? We're going to see that for Medicaid, we're going to see that for Blue Cross. I think we should be able to see that for Medicare. Total expected cost of care and the actual cost of care. Just those two numbers. Can we get those for Medicare? Yes. Okay, today, sometime? Well, right here is the expected for 2020, and we'll figure out what it is for 2018, and then there are the 2018 results that we're getting to. Is that a yes? Okay, great. I'll just clarify something. It's been a while since I looked at this, but I don't think the Blueprint money actually flows through the state. I believe that flows directly from CMS to the ACO and then out to the providers.
There's an agreement between DVHA and the ACO on the exact formula that they're going to use, but the state's not an intermediary for those funds. Other public comment? Thank you both. Thank you. So now we're going to transition to a discussion of the 2018 ACO results. Yes. I'm looking at Abigail, who has asked for just a small break so she can set things up for everybody for this discussion. Do we have some people that are coming at that time? We might be running a little early too, so... Yeah, I was looking at the clock, which was a big mistake. Yes, now we're off. I'm looking at the audience for others, so I'm expecting, I see one, two. I think we're missing our Medicaid folks. We are running early. I think they're scheduled for 2:30. I know this is going to ruin everybody's schedule here, but I think we're going to go into adjournment till 2:30. I know nobody likes that. Thank you. Now I'm getting a lot of people saying, what if they get here early? So let's adjourn till 2:15, and then we can make a decision at that time. Not by that clock. I know. Right, it is now 1:48. Okay. So we're here to discuss the 2018 ACO results. First we're going to provide an overview of ACO performance, and we're going to dig into the 2018 results by payer. Then the board can listen to or ask any questions, and then we'll have public comment. Really fun musical chairs. So just in terms of talking about measuring performance, we wanted to do some background on making sure that we're keeping all-payer model performance separate from ACO performance in today's presentation. So as a reminder, the ACO has contracts with payers. They are here to talk to you today about their performance for 2018, which is performance year one of the model. Separate from that, the state has its agreement with CMMI, the all-payer model agreement, that has separate performance and quality measures that we are responsible for reporting.
People, including myself, Sarah Lindberg, Elena and others, will be presenting those results as they become available, as we submit our final reports to the federal government, again for performance year one of the model. So we just wanted to talk through some quick stuff. So again, there are finance and quality outcomes for both the all-payer model and the ACO. We're here today to talk just about the ACO. And a reminder that trend analysis is not available until there are comparable data at two points in time. And we also just wanted to point out that an early indicator of ACO performance could be in the reallocation of resources. And we heard some examples of that at the ACO budget presentation a couple of weeks ago. For example, the palliative care that was brought back to Porter Hospital, and independent primary care practices have been hiring NPs to offer psychiatric services to patients. The most fun. This slide was made by Sarah Lindberg, just as a review of data timing. So I just want to say this is sort of a moment-in-time look at how data process through the years. So this is again just to remind you of how long it takes for these things to be reported and validated, and then for us to be able to produce results and reports on these things. This graph represents three months of run-out; for all of our annual reporting for the model, we use six months of run-out. Just another look at how data come in. So the impact of claims run-out is pretty significant: as you can see, about a third of the claims that were incurred in 2016 were processed by the end of 2016, in comparison to the end of 2017. So while working toward payer alignment, which as Michelle described is a goal under the all-payer model, not all of the payer programs are equivalent in terms of the financial and quality requirements that are going to be within the payer contracts.
So just to outline a kind of high-level comparison across these payer programs, you'll note that the risk corridors are quite different, and there may be different rules around truncation of outliers; whether or not they offer fixed prospective payments may differ across these in 2018, and whether or not they require reconciliation at year end. Furthermore, they're all quite different in terms of how attribution works. Medicaid is not far off the Medicare methodology, but there are some differences out there. Anyway, this is also to say that while these programs are all talking about dollars and quality, there's movement toward alignment as we move forward in the all-payer agreement. To add on to that, in terms of quality metrics, just to remind you that as we reviewed 2018 quality requirements by payer, not all measures overlap with those in the model or between payer contracts. The ACO payer agreements allow for variation in the quality metrics selected, to be specific to the population that they serve. So for example, it wouldn't really make sense to include a lot of adolescent measures in the Medicare set, but it does make sense for Blue Cross and Medicaid. Really small, I know. So this is just showing a crosswalk: the first column is the All-Payer ACO Model measures. So these are the agreement measures; again, these are just the ones that utilize claims or payer-specific surveys. The All-Payer Model measures column here does not include any of the population-based measures that utilize other methodologies. So BRFSS, vital statistics, they're not on this list. The second column is the Medicaid Next Generation 2018 quality measures. The third column is the Medicare Next Generation 2018 quality measures, and the final is the Blue Cross. This crosswalk is just helpful to frame up and sort of see how we're moving towards alignment.
And then, as the board recalls, with the ACO we have the ability to design the 2019 Medicare program through the agreement. And so we'll talk about that a little bit later on. But just a reminder that these measures are for 2018; they changed for 2019. So to summarize, today's presentation is really about 2018 all-payer, I'm sorry, ACO payer performance based on the contractual obligations. It is not an evaluation of the all-payer model. As Michelle mentioned earlier, the GMCB will present on an annual basis, and probably more frequently, finance and quality metrics as the model progresses. Additionally, CMMI will hire, or has hired, an external evaluator to assess the performance of this agreement. Before we move on, are there any questions about the nature of the presentations today and the broader context? We'll start, we're here in the shoes of Medicare, which cannot be with us today. So Michelle and I will walk you through what we know today. So this slide might look a little familiar. This breaks down the ACO's total cost of care, the expected versus the actual total cost of care, acknowledging that there's this difference between the advanced shared savings and the shared savings for which the ACO is at risk. So the $5.6 million in shared savings is the net of that amount. And the 7.7 is the money that the ACO has to earn back to pay for the Blueprint for Health and the community health teams. Quality measure selection for 2019 was done during the 2018 year; that was a process that, you may recall, Pat Jones and I did and presented to the board on, in conjunction with feedback from the health care advocate and OneCare Vermont. This is just a reminder, based on the last slide I showed you, that moving forward we've used authorities in the agreement to design quality measures that more closely align with the agreement and other payer programs.
While we're spending today talking about 2018, I just didn't want to lose sight of the fact that we have done work to really apply these things moving forward. So let's talk about 2018 following this. This slide comes directly from OneCare's presentation. It's available on our website, and it was submitted as a supplement to their budget. It's important to note here that benchmarks are moving targets based on historical performance nationally, and something that we just really want to reiterate is that changing population demographics and provider networks make comparison between years very difficult, if not invalid. We want to caution folks that looking at increases or decreases between annual performance results should be done with extreme caution, knowing all of the caveats in terms of changing providers and changing participants. And we don't have data yet to do a trend analysis, but as we start to achieve scale, that's something we may be able to produce analytics on at a deeper level as we move forward. But again, the 2018 results were based off of the Medicare Shared Savings Program measures. We changed the quality metrics for 2019 and moving forward, so trending between those two won't be possible for quite some time. This is a question that a lot of folks have, so we want to make sure that we walked through how the results were framed in OneCare's presentation, and that is that the earned score was 100%, and for purposes of the participation agreement between OneCare and CMS, year one was a reporting-only year, and that resulted in a 100% score. This is absolutely standard practice for all Medicare ACO models for year one. In addition, CMS regularly uses reporting-only status for new measures or measures that have had significant methodological change, and that results in two years of reporting-only status. So after they're introduced, or any changes are introduced, it is common practice for this to be reporting-only.
And I will just note that as of this morning, the 2018 performance nationally, which I know is a comparison that the board and others would be interested in, was not yet available on the CMS website. You can download performance results for specific ACO programs, but they have not produced their fast facts at the aggregate level as of yet. So when we do eventually come back and present year-over-year data and get into a deeper analysis, we wanted to lay out some exogenous factors that you should consider. The first thing is attribution. As provider networks develop, you're going to have different populations, and then you're comparing different groups. This is kind of an unusual thing, where I have somebody here with their hand waving, but typically the board would ask questions first. We're almost done with our Medicare portion. I think we could take a question on this section, and then, actually, I feel like we should move forward because we only have an hour right now. I just don't want to cut anyone's time off. It's up to you. Okay. So we're going to hold public comments. Okay. We're here. This isn't it. And they just had the slide of the results up, but now they've moved to another slide. Well, we're done talking about that. Well, do you want, is there anything more to say there, Michelle? What type of detail are you looking for on the results? I guess it would be a question to the board. Well, speaking for myself, I reviewed the slide deck prior to the presentation. So I personally did not have any questions on these results on this particular slide. Okay. So then we'll go back to the exogenous factors. Okay. So attribution. A growing provider network, payer churn, and attribution methodology, these all affect who the population is that we're talking about. Another exogenous factor is related to the population demographics.
As we know, we have an aging population with often greater acuity. Policy changes also affect the results, the quality and fiscal results. So delivery system changes, changes in payment in the system, pay-for-performance waivers, might render some added noise or confusion when interpreting statistical measures. So that's the end of the Medicare section. Any questions, or are we going to move on to the other payers? Thank you very much. Cory Gustafson, Department of Vermont Health Access. My department is the piece of AHS that is executing on the ACO agreements. So just to sort of kick off here, I'm going to say a couple of things in general first about DVHA as an entity. We have prioritized value-based payments, and that has been a focus for our organization, along with information technology projects, improvement in that area and performance. Overall, these are all important areas where we feel that the state can greatly improve in its execution. We've made great progress, and I think these results today demonstrate that. I think it's not a waste of time for me to do one quick high-level overview of what we're trying to do with this model. As you've heard before from many people, fee-for-service fragmentation and the unpredictability of the healthcare payment system are, I think, where we focus in trying to execute, the problems we're trying to solve. And the unpredictability of payment, you're going to see in the results today something that will speak to that and show, okay, the principles under which we're operating our program are perhaps giving us some good indicators from 2018.
So that unpredictability is: if utilization spikes for a population, it's good for providers and it's not good for payers, and the vice versa is not good for providers and good for payers. We're looking for a little bit more predictability, and less instability in payment, on both sides. And I think that's an important footnote before we get into our results: for those that are looking at the 2018 results as an early evaluation, set that as your frame, okay, I see where they're trying to go with this. And principally for us, the principles under which we're operating our efforts are really trying to introduce risk to the provider community in a measured and prospective fashion, and then promote alignment where possible, both on the payer side, which is why we're all sitting here today, but also through our relationship with the ACO, assisting where we can their alignment with the payer community. So there's the theory, and what we're talking about today is the execution. So I just wanted to set that table before Alicia starts talking about our results. Thank you. Good afternoon, Alicia Cooper, Director of Payment Reform and Rate Setting from the Department of Vermont Health Access. So with that introduction, and before we take a look at the 2018 results, we wanted to offer a reminder about our Vermont Medicaid Next Generation ACO contract term. The original contract was a one-year agreement for the 2017 calendar year and had four optional one-year extensions. DVHA and OneCare have triggered one-year extensions for each of the 2018 and 2019 calendar years, and we're currently in the process of negotiating another one-year extension for a 2020 contract year. After 2020, the parties will have an option for one additional extension thereafter. We renegotiate our rates on an annual basis, and to the extent necessary, reconciliation may occur more frequently, although we've not needed to do that thus far in our program implementation.
So we'll begin our review of the results by highlighting that DVHA and OneCare have continued operations and used the 2018 performance period as an opportunity to make incremental programmatic improvements. By continuing operations, we at DVHA feel that we've enabled another year of testing this model, which, as Cory mentioned, is a priority for us. This also means that we across our department have continued evolving how each of our functional units approaches their work, and how that work is evolving in light of having an increasing number of our Medicaid beneficiaries in an ACO model. We've also looked at opportunities for programmatic improvements to continue helping structure the program in ways that will best enable providers to participate in delivery system reform. And one of those areas that we would highlight for 2018 is continued expansion and evolution of our waiver of prior authorization requirements. In the first year, as you may recall, we had a waiver of prior authorization that was specific to members who were attributed to the ACO, providers who were participating in the ACO, and services that were part of the ACO's total cost of care. We learned from working with OneCare and other providers that this didn't necessarily enable the easiest referrals inside and outside of the network of providers. And so in 2018, we expanded this waiver of prior authorization to all Vermont Medicaid providers. So it still had to be for an attributed member and for services in the total cost of care, but now those referral patterns could happen a little bit more organically and wouldn't necessarily be limited to providers in the network, where there might be easier rules around prior authorization because of the waiver.
We see this as an opportunity to further decrease administrative burden for providers, both those providers who are participating in the ACO network and providers in the broader Vermont Medicaid network, relying on their clinical expertise and decision making in care for patients rather than traditional utilization management methods. A second area of results that we'd like to highlight is that the program continues to grow. We have now had two complete years of program performance, we're in our third year of performance, and we're planning for our fourth year. And across that period of time, we've seen incremental growth in the number of health service areas that are participating with OneCare's network, the number of unique Medicaid providers who have agreed to participate in this model, and, as a result of that, the number of Medicaid beneficiaries who have been attributed to the ACO through our Vermont Medicaid Next Gen program. And without going into a lot of detail on the summary table, I think the thing that we would like to note at this point is that we started the program in 2017 with 29,000 attributed members, and as we plan for the 2020 performance period, we are looking at potentially 86,000 Medicaid members attributed using our traditional attribution approach for the Vermont Medicaid Next Gen program. And we're also looking at expanding our geographic attribution, which was piloted in the 2019 performance year, to apply statewide, which could increase that number further beyond the 86,000. And speaking a bit about the attribution, I wanted to take a moment to highlight how we've been thinking about those opportunities for programmatic improvements over time. I think the attribution example nicely illustrates how we've tried to take this year by year and look at opportunities for adjustments. In 2017, our methodology was aligned very closely with the methodology used by the Medicare Next Generation ACO program.
In 2018, we made some additional refinements to that Medicare methodology, recognizing that the Medicaid population is a little bit different than a Medicare population, and that there were opportunities for us to also take feedback from the provider community and from OneCare about primary care relationships versus specialty relationships. In 2019, we further adjusted our base methodology to look at a longer look-back period. And we also began a pilot of geographic attribution in the St. Johnsbury HSA. Through this, we've done a bit of learning in this 2019 performance year, such that we're looking at a statewide expansion of the geographic approach to attribution for the 2020 performance year. Through all of these changes, what we've been trying to accomplish is making our attribution methodology more tailored to a Medicaid population, and also making attribution more in line with how participating communities were thinking about accountability for the populations in their areas. And so I'm sure there will be additional opportunities to learn from the geographic attribution in the coming year, and more refinements to come, but this is how we like to think about continual improvement and refinement in a year-by-year manner. The third result that we'll highlight today is around financial performance. DVHA and OneCare agreed on the price of healthcare for the attributed population up front. And in 2018, the ACO provided approximately $1.5 million in care above that expected price. This financial performance was within the 3% risk corridor, which means that OneCare Vermont and its member providers paid this amount back to DVHA. This slide is here so that we can do a brief refresher on the financial methodology. We agree, as I said, to a price for the care for the attributed population at the beginning of the year.
And that price is also subject to a risk corridor, which, as the Green Mountain Care Board staff mentioned, for the Medicaid program was 3% for the 2018 performance year. So in this figure, the price is the dark bar. The blue dashed line represents 100% of that price. And then the red and green dashed bars around that represent the risk corridor. And at the end of the year, we compare the price to the actual amount of money spent on services in the total cost of care for the attributed population. And wherever that actual expenditure is, we understand it relative to where it falls, either inside or outside of the risk corridor. The risk corridor serves a number of purposes. One, it serves to protect against anything that might be a catastrophic loss for the ACO or the provider community by limiting the amount of financial risk that they would have to bear. So anything that would be, in this instance, above 103% of that agreed-upon price is something that DVHA would continue to pay. It also introduces some provider risk and a sharing of that risk between the provider community and the payer in a way that we haven't had previously in a fee-for-service environment. So any expenditure that's between 100% and 103%, the ACO is taking accountability for. In 2018, the performance was in that range, and so DVHA received payment back from the ACO as a result. If the performance is within the risk corridor but below 100%, because we've agreed upon that price up front, DVHA will ensure that OneCare has received 100% of the price. Anything that's below the risk corridor is ultimately returned to DVHA, if that's where the performance would be. And this creates an incentive to spend in alignment with what we're expecting the total cost of care to be, and to protect against potential rationing of care. So this next figure shows two years of performance side by side.
The 2017 performance and the 2018 performance, as you can see, were both within the agreed-upon risk corridor in our contracts with OneCare. In 2017, performance was below 100% of the price, and so DVHA issued payment to OneCare up to that 100% mark. And in 2018, performance was above 100%, and so OneCare returned that difference to DVHA. We think that these two years of performance within the risk corridor have been an encouraging signal about the potential of the model that, as Cory mentioned, prioritizes the use of prospective payments as an alternative to fee-for-service, and also shares risk with the provider community in a way that Medicaid has not done previously. Now turning to quality results. The ACO's quality score was 85% on 10 pre-selected measures. Of note, OneCare's performance exceeded the national 75th percentile on three measures: a measure relating to developmental screening in the first three years of life, a measure relating to 30-day follow-up after discharge from the ED for mental health, and 30-day follow-up after discharge from the ED for alcohol and other drug abuse or dependence. More detail is available on this slide. We won't walk through that in depth, but it is available here. This table is also available in the report that is linked on the prior slide. The final result area that we'll discuss is expansion of implementation of the advanced community care coordination model to all participating communities in 2018. Of note, in this area, OneCare distributed approximately $2.7 million in payments to 65 community partner organizations, including primary care practices, designated mental health agencies, area agencies on aging, and visiting nurse associations, for engagement in the care model. There was also notable uptake in the use of Care Navigator and training of community care team members in care coordination skills and core competencies.
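The risk corridor settlement walked through on the prior slides can be condensed into a few lines. A minimal sketch only, assuming a symmetric corridor and ignoring truncation of outliers and other contract details; the function name and dollar figures are illustrative:

```python
def aco_net_result(price: float, actual: float, corridor: float = 0.03) -> float:
    """ACO's net gain (+) or payback (-) relative to the agreed-upon price.

    Inside the corridor the ACO keeps savings or repays overspending dollar
    for dollar; outside it, exposure is capped at +/- the corridor percentage,
    with the payer covering (or recouping) the remainder.
    """
    cap = price * corridor
    return max(-cap, min(cap, price - actual))

# A 2018-style outcome: actual spend ~$1.5M over the price, inside the
# 3% corridor, so the ACO pays that amount back to the payer.
print(aco_net_result(price=100.0, actual=101.5))  # -1.5
```

A 2017-style outcome (actual below price but inside the corridor) produces a positive result, the sense in which the payer ensures the ACO receives 100% of the price.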
So I think all of this speaks, beyond the quality and financial performance that we're seeing in this 2018 performance year, to progress being made in the rollout of the care model. And for our final slide, we wanted to note a couple of opportunities that we have an eye on as we move the program into another contract year. The first is reviewing and potentially modifying DVHA's requirements for prior authorizations and service limitations. This goes back to the beginning of the conversation around the waiver of prior authorizations. We've continued to learn from working with OneCare and our providers that something that would be of great benefit would be a single set of rules that apply across a payer population, perhaps ideally across multiple payer populations. But in particular, what we're looking at is restructuring some of our prior authorization requirements as Medicaid, so that if a service was contemplated as part of the total cost of care, we may consider no longer requiring prior authorization for any Medicaid member, regardless of attribution status. And the goal there is that it simplifies workflows for providers that are participating in the ACO model, such that they don't need to know if a member is attributed or not attributed to know if they'll be eligible for the waiver. Another opportunity that we've identified is restructuring our utilization reporting to better understand patterns over time. We have had challenges with a continually changing cohort of individuals who are attributed to the ACO over the years of experience that we've had, as we saw on the slide of the program growing. And underneath that, we have continual changes in the group of individuals who are eligible for Medicaid. And so we have a lot of dynamism, which makes it difficult for comparisons to be appropriate. And so we're looking at ways we can improve our utilization reporting going forward. Good afternoon. Kelly Lang, corporate director at Blue Cross Blue Shield of Vermont.
Thank you for having me here today, and for the opportunity to really, for the first time, present our experience within the ACO program. You'll see a similar comparison as for the other payers, but I wanted to start with a refresher on the Blue Cross vision and why we're doing this. And the first word is "together": we really see all-payer models as working not only with the payers here at the table, but with the ACO, the providers, and the community providers. If we're going to work on healthcare reform, we have to all do this as partners. So that is really guiding and fitting with the vision of Blue Cross. So jumping right in — just making sure the slides are moving, this is a little hard — we'll go in a similar manner, but first we want to do a program term overview. 2018 was our first risk contract with the ACO. We did have prior shared savings experience. And this is for our qualified health plan lives that are attributed to OneCare through primary care. Primary care does include nurse practitioners and primary care physicians, similar to a Blueprint attribution. We do have the 50-50 shared risk that was explained earlier. And within the 2018 contract, we aligned our quality measures, and also most of our contract, with what Medicaid was working on, to the extent possible — Medicare being the Next Gen model, there are some specific differences there — but really our contract is fairly aligned with the Medicaid model. Performance and quality impacting the value-based incentive fund, which you've heard about through OneCare, is also aligned, along with the metrics that were just described. And we do actually have collaboration requirements in our contract for care coordination, analytics, and gaps in care, so it's not just a simple financial and reporting arrangement. There are real requirements for engagement between us and OneCare. So we'll jump right into the quality results.
We tried to present a lot of the information in a way that's similar to the Medicaid slides. There are a couple of differences due to the different population. I'm going to walk through a little bit of this, because how we support OneCare is one thing, but we also dig into the quality results with OneCare and within Blue Cross, comparing to the extent we can to our non-attributed Blue Cross population. We do have lives in our QHP that are not attributed to OneCare. And I included 2017 right here, but I want to caveat it, much like everyone has done with their quality results: 2017 was shared savings only, we had a different provider network, it was larger, and we had a larger group of members, so I would not compare the years in terms of a trend, but we use them in terms of indicators. And there are some positive indicators here, and there are also indicators where we need to look at why there might be a drop in some scores, or where we have opportunities together. So pointing out what we saw as real positives, especially when we compare to Blue Cross results: the 30-day follow-up after discharge from the ED for mental health had an increase up to the 90th percentile. This was a measurable increase in comparison to the non-attributed QHP population, showing an effort to follow up after ED visits. OneCare and Blue Cross have also worked to develop a better way to share mental health data, which we've talked about here before, in a blinded manner, but at least getting indicators, and we're working together to see how we can make improvements to the extent we can. Another result where we also saw a measurable difference from the Blue Cross population was the adolescent well-care visit. Engagement in primary care, especially for adolescents, is really important for the long-term health of Vermonters, and we did see a measurable difference in that engagement.
And then actually going back up to the first one — and this is one we've had conversations about internally and externally — the 30-day follow-up after discharge for alcohol or other drug dependence: there was an increase, there was improvement, but if you look at the benchmark nationally, a score of around 19 or 20% gets you to the 75th percentile. So as a nation, we're struggling on this. We're struggling within the state. So just because there are full points doesn't mean we're not still focusing on it. We're not taking it off the scorecard for next year. There's plenty of room for opportunity. I'm not going to run through all of these, but we dig down into them to the extent we can, to see where we can focus attention, and I can come back to any questions there may be. So as we go on to look not just at quality but at what we call clinical results, which might also be termed utilization — I only use two examples here, because this is something we're really developing with OneCare: what are some indicators of how the population is performing? But as you do that, you have to see how the population is changing. And we actually compare the demographics of the qualified health plan population to those Blue Cross members who aren't OneCare members. And you can see there's an age difference: the non-attributed population is a bit older. But the interesting part, which matters when you look into the population and then at quality and clinical results as well, is that while the OneCare population is a few years younger, the risk that we've identified — and this is Blue Cross's own risk-adjusted review — the case weight, or the risk of the population, is greater at OneCare. So when you look at performance, you have to consider the population that you're working with, and it has increased risk over time based on our review.
So with that increased risk in mind, we looked at — and we just pulled two indicators here, but very important indicators — utilization, and how the workflows and processes being developed can be evaluated. So here is emergency department utilization and the comparison with the Blue Cross population. While OneCare was at 231, the non-attributed Blue Cross QHP population was at 241, so OneCare was improved. Inpatient utilization was the inverse: it was slightly better in the Blue Cross population at 44.8, but if you look at the risk of the population, there may have been a need for increased inpatient care. Those are the details you need to dig into — or it could simply be that it needs to be managed better, and those are the data points we're looking at. And then we do have a focus, also working with the Blueprint, on primary care. We saw that with the adolescent well visit, but we're also looking at primary care visits — and this is all visits, not just preventive — and we're starting to look at this data to determine how our joint collaboration allows for greater primary care use, and appropriate use when necessary. Then getting into financial spending: I wasn't going to put 2017 in here, because it was shared savings only, but given that it was our first year — a Medicaid-style year zero, and we're the only state that can have a year zero — I wanted to include it, because it's something we look at as a comparison mark, though I would say it's not part of our risk contract. These targets, to remind you, are based on the qualified health plan rates approved by the Board. We did have different arrangements between 2017 and 2018, not just the risk — we had some corridors — and I wanted to point that out because we renegotiate and evaluate our contract every year and look for opportunities for improvement. So maybe we modify the target-setting method. The important part is to get to an appropriate target.
So that's the target that's used to determine savings and/or risk after that. Looking at member months: the number of members was just around 19,000 for 2020 — excuse me, 2018 — and that equates to those member months. In 2017, there were closer to 24,000 members. And if we look at the results, OneCare did have risk — they were at 101% of target — but in looking at it, we're close to target. If we were wildly off, we'd have concerns with our target, much like the corridor conversation. And so OneCare did pay a settlement, and you saw that included in our rate filing for next year's QHP. We did want to demonstrate the alignment between our QHP approved rates and the target — that's a point you discussed before — and the filed rates directly impact the ACO target. In 2018, there was a reduction in utilization trend put into our QHP rates. If that had not been implemented, the ACO would have had savings. So there's a direct correlation within that population between the rate setting and the target. Taking that and looking at early indicators — one year is not a trend, so these are indicators — and looking at what's working and where we see potential impediments going forward: we are seeing positive impact. OneCare is actually showing some considerable impact due to engagement approaches, and also impact on where we're going to look going forward. In some areas, the ACO is better than our non-attributed Blue Cross QHP population; in other areas, Blue Cross is slightly better. But that's where we are seeing favorable results. One part that underlies all of this is that we have joint teams working weekly and monthly on these programs, looking into the data, but also at how we get the data to be actionable. So those are some real bright spots. Impeding progress: the need to account for demographic changes when we're judging quality, and the small numbers in some of these denominators.
And that can really skew how you're reviewing the performance, and then there's the alignment of the target in terms of financial performance. And while we're staying aligned with the all-payer model and the work that's been going on, we're looking at condition-specific metrics to implement some demonstration programs with OneCare. While keeping alignment and support with the all-payer model, we're actually looking to see what other meaningful measures show patient engagement, clinical health assessments, clinical measurement, and financial impact for programs. And we're also looking to learn from others on social determinants of health, and from others at this table, on what we're doing with our metrics. And then we did want to come back to the all-payer model achievements, because putting the populations together in terms of focus on quality allows Blue Cross, the providers, and the ACO to really focus on areas of need across a broader population. If Blue Cross were doing this on its own, it would be a small subset of a provider's panel. So we do look at the achievements: 2018 was our first year of risk, which we had not been able to move to before with the ACO; improved analytics; clinical opportunities; and the first time really aligning with Medicaid on our work and leveraging the collaboration between the two. We are always looking at how we can address the challenges that are within our control. We have, transparently, had growing pains in terms of data mapping and data sharing, which we hammered out a lot of together, and that really speaks to the collaboration between our teams. Expanding our provider network has been an issue. Risk — as you know from the ACO budget conversation, putting risk on the providers has limited their involvement in our program, and especially, I believe, the alignment of the target setting with our QHP population has created our slower growth.
I do know we're getting a couple more providers for next year, but it is not at the extent of Medicaid yet. We have had some complexity barriers with the fixed payment process, which is different for a commercial payer. We are looking to implement it for that small number of hospitals next year, so we're moving along — I don't foresee any issue with that — but it has been a difference between us and Medicaid. And then, going back to the risk, we really want to ensure the sustainability of the healthcare system, and that's why we've collaborated with OneCare on an incremental process of taking on risk as we look to expand the programs, rather than just placing that risk on the healthcare system. And then the last one is something that I think we've all said, but it bears repeating: success cannot be measured in one year. There are favorable early indicators — which is the next slide — which have refined how we're looking at this program, but there's enough, and I think Alicia said it well, to continue to next year. We see value, especially in the collaboration. And then the last piece is something we developed internally: understanding the value of the all-payer model. These are questions that we've asked ourselves. What's the value in it? As we're evaluating the program on behalf of clients and members: did health improve, not just to hit a score? And that's where we're looking at social determinants of health and patient engagement. This is, again, not a one-year answer, but if we can have favorable indicators over time, these are some of the questions we've discussed. Thank you very much. I do apologize for not introducing everyone at the beginning, but we also have Sara Barry from OneCare, if you have any additional questions that we can have her answer directly or that require OneCare's input.
And we just wanted to add this slide, which came directly from OneCare's presentation two weeks ago, of their performance across all payers. We'll start with questions from the board. I appreciate very much that this is a movable feast, that it's incremental and it's going to take a number of years for it to reach its full potential. One area that I'm wondering about — and I guess this would be a question to either DVHA or Blue Cross Blue Shield — is the QHP benchmark plan. As I understand it, having talked to some folks and looked at the CMS website quite extensively, we have a benchmark plan for the QHP population that goes back to 2014, which kind of predates everything that we're talking about here. And when you look at that plan — and I haven't done a comprehensive review of it, but I looked at prediabetes, for example — if you are prediabetic, probably the only benefit you get is a prevention visit to the doctor, and then, for a specialist fee, which I think is 90 bucks, you can have a visit with a nutritionist. But there's nothing in it for fitness, and everything I've read in the clinical studies people have shown me would say that fitness and nutrition are the two things that keep people from becoming diabetic, from crossing that border. And in Blue Cross Blue Shield's marketing materials, the annual cost of care for a diabetic is about $7,400 a year, and a little over $5,000 of that comes out of the pocket of somebody who has a bronze plan. So I'm just wondering — given that, at least in 2018, the QHP plans brought Blue Cross Blue Shield over $300 million in premiums — it seems to me that benchmark plan might be something that could be scrubbed to make sure that it's aligned with everything we're talking about today in terms of prevention, and leveraging more of the extensive amount of money that is being spent through those plans. Is there a question? I'll toss a coin.
Well, what's the question? Well, do you have any plans to review the benchmark plan? I've heard people say — people who were around before me — that the initial benchmark plan was somewhat of a food fight, in terms of a whole bunch of interests trying to get themselves positioned in that plan. But I would think, given the all-payer model targets, that a review of that plan tied only to prevention — not to adding benefits, but just to enhancing it and aligning it as best as possible with the population health goals that we're all looking at — might be a good thing to do. So I'll start in; I'm sure Kelly will have a better answer after I approach the question this way. The idea of paying prospectively and paying providers differently is the potential of the model. What you're describing is taking the model that we've had in healthcare for many, many years — utilization management and other health plan benefit constructs — as a way to get to where we want. And if, in the rooms I'm in, you ask, are you happy with the current healthcare system? — very few people put their hands up, maybe a few surgeons, but mostly people are saying, yeah, we want to approach it differently. And so I think the idea of prospective payment with some risk motivates — incentivizes — the provider community to look at healthcare in a different way, because they have a responsibility to provide access and quality care in their communities. What I have seen is a motivation to have enough funding through a fee-for-service model to keep their doors open, and so that motivates them to do certain things that provide margin — and this is a way to move away from that.
And so I wanted to say that I think that is exactly what we're talking about with an all-payer model, with a prospective and risk-based agreement with the provider community: you can't really manage your way out of this through benefit design, because we've been trying that for decades. Managing the benefit to provide incentives to individuals — I think we can still do that in certain places and ways. But I wanted to take that first approach to the question and say: I like the question, because I think it highlights what's really different about what the model is attempting to achieve. And I appreciate you also saying incrementally, with patience, and with an understanding that each of us is experiencing a healthcare system that evolved over many decades without a real plan. This is a little bit more of an organized approach, across multiple payers and multiple providers, trying to bring alignment to our system. So I know that doesn't answer whether you can look at the benchmark plans again — I think that's an easier question to not say yes or no to at this moment. But I think the question itself does offer me the opportunity to say: this is what we're talking about. It's not necessarily payers trying to push providers to do things; instead, it's for providers to be motivated, or incentivized, through the flexibility of a different payment model to do things differently. So I hope that answers the question. Corey answered it quite well. I think the all-payer model is a vehicle to test out some of these payment differences that can really support the care model we're talking about. So we've got, even without a fixed payment at a hospital, a look at total cost of care, and a way for OneCare and its provider network to allocate funds differently, which is what we're talking about.
We're looking at ways to pay providers on a per diem or case rate that would encompass what would be preventive services. Those are things we're discussing in an incremental way; some of those programs underlie the ACO program already, especially the mental health and substance use disorder programs we're piloting. I do think there's always room to look at benefits and their relationship to this, but within this model it's really within the care delivery system: could we find an innovative way to get people in for primary care engagement, and then find some way for primary care to refer to, say, a food delivery service? Those are things, potentially within the legal parameters we need to evaluate, as we look at how we redesign the delivery system and fund the delivery system. I think that the all-payer model would give providers much more flexibility in the services that they provide. In terms of the way that the benchmark plan is now structured, it's still kind of old school, and hopefully gradually we can move to the new school and give providers much more flexibility in how they approach a prediabetic. Another question I have — and I had this question for the DVHA staff last February — is that it's noted here that premiums are an issue within this transition. When you look at the benefits cliff at 400% of poverty, it's to me very striking, where somebody at 399% of poverty is paying $150 a month for a plan, and somebody at 401% of poverty is paying the full, unsubsidized premium for the same plan. So what I'd like to ask again — and I think DVHA is the best group to do this — is can we find out what it would cost to help people above 400% of poverty get to the federal definition of affordability?
I think it's at 9.496%, something like that. Because to me it's just one part of leveling and making more equitable and fair the playing field across a number of issues — whether it's the benchmark plan or subsidies or payer mix, over time we've got to make these relationships as equitable as possible. Do I need to ask a question? Okay, can you get your staff to get that done? I mean, I guess we could say yes. We were going to talk about the 2018 ACO results, so I guess we're just not prepared to talk about that right now. But we are thinking about the marketplace. We have a QHP responsibility for sure, as does the Green Mountain Care Board — that's probably why you're asking me about it today, since this is probably the first time in a long time I've been before this board. But we are looking at the marketplace; within AHS, the Health Care Reform Director leads this scope of work, I would call it. There was a legislative request to look into a mandate, to look into the marketplace, which we are currently engaged in, and our report is on its way, I would say. So, on doing specific data analysis based on a question during the ACO review — we can take that offline if you'd like. Thank you, I appreciate your comment. It's a good observation; we're here to talk about 2018 results, and I'm just trying to look back at the 2018-2019 benefits cliff — it exists and is well documented — and trying to put a spotlight on it. Yeah, no debate about the benefits cliff whatsoever. And actually, if I could tie it to this conversation for a second, I would say what you're talking about is financing the healthcare system, and we're talking about changing what the healthcare system costs.
How the money moves around is one conversation; how providers get paid, how much they get paid, what they're able to do with the dollars, and the flexibility they have to do things differently is really what this conversation is about. My question — and I think it's both for DVHA and for Blue Cross — is about how you think about performance within the risk corridor. Let's say, hypothetically, we were to see an ACO be consistently under the price or consistently over the price. How would that impact your thinking, as long as it was within the risk corridor? I'll take that one first. I think what we've seen so far, for the two years of experience that we have, is one year of performance within the corridor that was under the price and one year that was above the price, and I don't know that we necessarily have expectations of where the actual performance lands relative to that price, other than saying that the risk corridor helps keep the actual performance and the price fairly close together in terms of how the incentives are structured. If there were several years in a row where performance was within the risk corridor but underneath that price, I don't necessarily think, from my perspective, that that's problematic. I think that's probably a signal that there were opportunities for more efficiency in spending, and that the model would be starting to realize some of that.
The only other thing I'd add is that our current total cost of care is based on actuarial analysis of trends in a fee-for-service model. If we go back to the chart we've had — we call it the Arnold Palmer, with the lemonade and the iced tea; I'm not sure which is which now, but say the iced tea is the fee-for-service portion of the payment — then in theory, the more we have in the prospective payment, the lemonade, the less the corridor will really matter. It will be about getting a predictable payment system to providers, where they know how much they're going to have to provide a certain amount of care, and then they can make decisions that aren't all about finding margin. To be honest, you need margin to cover things in a hospital setting, for example — you need margin on certain lines of business to cover the lines of business that lose money — and if you're not motivated in that way by a fee-for-service model, and you have prospective payment, then it becomes more about how much they actually spend to provide care. We would still be tracking clinical data and what was done, and then I think we would be building year over year — and this is a way off, right? This model is truly in its infancy, but I think, as everyone has said, so far the early indicators give us the idea that we should continue to pursue and test the pilot.
Just two things I'm going to add. One is the difference that we do have in our target setting: our QHP rate approval is what guides our target, so if there's savings, that's reflected in next year's rates, and then hopefully we'll still have savings if there's room in clinical system reform. If it's always risk, we need to evaluate how the rates are set and the relationship between that and the target. We really do see that eventually — we are further out, and it will take a little longer to get to that fixed payment — that could be the model, and I agree with what Corey and Alicia said, that allows for predictable growth. But we have to get there, and we have to recognize it's going to take more time. We are lagging in attributed lives, so I expect the performance to go up and down given, hopefully, our network growth, which changes the demographics that I discussed. But it is directly related to, and shared back within, the rate filing. The first positive thing we see is that we're living within the risk corridors across all three of the payer streams. But the data that we're looking at really is at the aggregate level, and I just want to understand how you are leveraging the learnings that you're getting from within each HSA, because we know HSAs can look very different than the aggregate level — we could actually be hitting some of the caps on the upside or downside — as well as between the fixed prospective payment and the fee-for-service piece, and how you originally allocated that versus what's actually happening. Because one of the things we hear a lot about, of course, is the risk that needs to be taken on, and the concern about taking that risk at a hospital level. So it's great to see that at the aggregate we're doing well, but diving down a little bit deeper into HSAs, are there learnings there, any reallocations from what you're seeing? Just to clarify for you — I guess I didn't quite catch all of that.
So I think we have a couple of early observations in that arena. Certainly the size — the number of people in each program, divided among however many health service areas — makes a difference. But we are seeing things like, in some smaller communities, one or two very expensive cases that might have been unanticipated, or just randomly happened, can really have a dramatic impact on the overall cost at that health service area level, and that causes concerns when we have bigger-picture conversations about how much risk a community or a hospital is able to take. So that's something we're paying attention to as we look at our current strategies around how we share risk and how that gets allocated, as well as what some future models might look like. We also are paying more and more attention to the utilization of services within OneCare's network — both how that transfers from one community to another and what is happening outside of our network — and more and more we're recognizing the need to dig into the services happening outside of our network altogether and really start to think about: what are some of the signals? What could that be telling us? Where is it about access to services? Obviously there are very specialized needs where somebody has to go to New York City or Boston or even further away to have those needs met, and it's crucially important that those services are there. And at the same time, how do we think about all of those upstream services, how do we wrap around care, and look for the opportunities to prevent what might be causing that need in the first place? And just one other question, or point, I guess, on what we're looking at: these quality metrics — I know when you showed the Medicare ones there was quite a lot of detail on the number of quality metrics and where we stand.
I think until we really have comparisons against another year, or against non-ACO populations, it's harder to determine how we're doing. I think we need multiple years — and where Susan was going before, I think you're expressing that we're not really diving into each of those; I don't think that's the case. I think the case is more that until we look at multiple years, it's hard to look at something and say, okay, we're in the 50th or the 75th percentile, without knowing where we would be if we didn't do this. Until you have a couple of years, and until you have, perhaps, the set that's not in the ACO versus the ACO, you won't be able to compare those. So it's going to be interesting to watch this progress and to be able to really look at the quality metrics and understand if they're moving. And I don't know if you have any comments overall about how we should think about that, and where things stand, or why some things might be in a lower decile versus a higher decile? Sure, thank you. All of the caveats that we've discussed previously and today are crucially important, and we are limited in our ability to compare. And yet that doesn't tie our hands from thinking about where, globally, there are opportunities — where we, as a provider network, are not entirely satisfied with the care that's being delivered. So as we looked at the 2018 results, which we had in our hands in late September, we got together and really started talking with our providers about that question. Some of the focus areas that are coming up for us right now, in partnership with our payers, are things like chronic disease management — looking at the variability, where we might be doing very well as a provider network on hypertension management for one program but not for the next, and trying to understand why that could possibly be.
Looking at things like diabetes management, and recognizing, in both of those chronic conditions, the way that the measures themselves are structured: it's not about having a test — a test is binary, yes or no — it's about whether that blood pressure is under control, whether that blood sugar is under control. And we know that clinically that takes months and months, and a strong relationship between primary care or a specialist and the patient, the individual, to make that happen. So we're trying to work at that foundation to really strengthen that engagement, that connection, and then over time — likely the next several years — those are some of the things we're hoping to see improvements in. Other questions? I just want to add to that, as Green Mountain Care Board staff: as we work to produce the annual statewide health outcomes and quality report, we have the ability to look at the measures that we are responsible for through the model. Some of those do overlap with payer programs, but what we have been doing is calculating those at the level that they need to be reported for the model, while also looking at them by payer program through VHIRT. So as we work to get some of these numbers produced, if we're seeing high or low performance relative to the target, we'll be able to more easily drill down into what program might be causing some of that difference. I'm going to try — I can't hear you, Jess. Okay, can you go to slide 16? 2018 qualifiers — that's my slide. Sorry. I think the question might be: can you describe the 82.4%? What does that mean? Sure.
So if you take the previous slide graphic, and what was submitted by the ACO in their budget submission: at the top row you've got a scoring based on benchmarks for the reporting year, and the ACO has highlighted the score percentile they fell in. Each of those scores maps to a number, and they're all listed at the top, from 1.1 points up to two points. Before you add those together, first you have to take out the ones that have no benchmark or no score available. That's 18 points out of the total. So then you get down to 40 possible points, and of those 40 points, they earned 32.95 if you go through and do the math, and that's an 82.4% score. But again, for the purposes of the participation agreement between OneCare and Medicare for year one of the model, it is a 100% score for reporting. Yep. So just to clarify further, the reason, or part of the reason, that any ACO in its first year is scored on reporting only is that built into the algorithm for counting those points is a year-one-to-year-two comparison, and quality improvement points that are normally added, and those can't be calculated if you don't have two data points, which is why inherently they don't try to do that underlying calculation. Fantastic. Thank you. Other questions? So at this point I'm going to open it up to the public for public comment. Yes, Bob. This is for Blue Cross. I see that one of your all-payer model challenge statements was the challenge of aligning premium setting with the ACO expected spend target. Could you explain that a little? I can imagine several things that would be difficult there, but what do you think is maybe the top issue under that point? In a previous slide, we discussed the impact of the utilization and rates approved by the Green Mountain Care Board.
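The point arithmetic described above can be sketched as a small calculation. This is an illustrative reconstruction, not anything presented at the hearing: the function name is made up, and the 58-point pre-exclusion total is an assumption inferred from the figures quoted (18 unscorable points removed, 40 possible points remaining, 32.95 points earned).

```python
def quality_score(earned_points: float, possible_points: float, unscorable_points: float) -> float:
    """Percentage score after dropping measures with no benchmark or no score.

    The denominator is the remaining scorable points, not the original total.
    """
    scorable = possible_points - unscorable_points
    return 100 * earned_points / scorable


# Figures quoted in the discussion: 18 points excluded, 40 scorable, 32.95 earned.
# The 58-point starting total is assumed (40 scorable + 18 excluded).
score = quality_score(earned_points=32.95, possible_points=58, unscorable_points=18)
print(f"{score:.1f}%")  # → 82.4%
```

This matches the figure discussed: 32.95 of 40 scorable points works out to 82.4%.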
So we file our rates well in advance, and because of the review process, at times there are adjustments. And if there are adjustments from what we file, those adjustments then pass along to the ACO target methodology. So if there was a reduction in utilization placed within our rates, that then reduces the target for the ACO. And we have to assess the various points of alignment of our regulatory system, between hospital budgets, rate setting, and ACO budgets; inherently they are connected. So really, aligning those processes is what that's getting at. Just to follow up, do you think adequate progress is being made in the area that statement was pointing at? In the rate-setting alignment? Yes, in what you just said, toward aligning premiums to your ACO spend target. I think as the years go on, we will find out more. This year's was our first rate filing that included the ACO payment at risk, so I do believe that shows the alignment, which is the first step in including the performance. Thank you. Yes, Walter? I just have one question and two comments, actually compliments. One, I speak as a patient, and the idea of eliminating prior authorization, as someone who almost died from prior authorization before, that is a real plus, and I want to compliment that. I also want to compliment Tom on his question about financing. I think it has a lot to do with delivery systems, whether you like it or not; it does. So many Vermonters are priced out of the system now, with deductibles, co-pays, all the rest of it; they can't get insurance, or are forced off of eligibility, and so on and so forth. The next one is delivery system reform. And I ask this mostly as a patient, outside of the complexity of the whole all-payer model: when we talk delivery system reform, does that mean that a patient gets more than 10 minutes with a doctor? And that's if you're doing good, with a primary care provider.
And that's the question. When you go into a primary care provider, if you get 10 minutes, you're doing good. And so I was thinking of that as a patient going into a doctor's office. You want to try that one, Sarah? Sure. So I think the intention under these value-based payments is to provide more flexibility for patients and providers, together, to identify what the needs are for that unique individual, and to work in a better, more informed way around how to meet those needs. And that can be one-on-one between provider and patient, or it could be extending that care team, which is a lot of the work that we do. And so that might be into other providers in the office, or into the community. But that flexibility around not having to count 10- or 15-minute visits in the same way, because each is tied to one specific fee, is a very important underlying principle in these value-based payments and in the care delivery transformation that we're all working towards. We're not there yet, but I think we're starting to move there, I'd guess. I think it was a great question, Robert, especially after the story that was in the press yesterday about people being asked not to raise other questions with providers. That is just going so far in the wrong direction. My understanding is that it has been addressed right away, but I'm assuming, Sarah, that there is absolutely no incentive for anybody to limit the conversation between a patient and a provider. Absolutely not. It's really about that relationship, and identifying the needs and the plan to support those things. My bigger fear, Walter, is that if we don't start doing something differently with higher education, we're not going to have enough providers. It's a national problem. That's my biggest fear. Take a look at just trying to call and schedule a colonoscopy appointment; it has a long way to go. I know that one doctor is retiring, and they've been recruiting for a while.
It's not easy to find doctors, and those are the type of things that really scare me. I agree, and at the practice I've been going to, I've been through four or five primary care providers, and every single one of them has either, A, retired, or, B, gone underneath the umbrella of a hospital. It was so nice when everybody had a provider that was with them for a long period of time, so they actually knew the patient. And I've lost providers because of network troubles: one insurer takes over from another and kicks everybody out of the network. I've gone through all kinds of messes with providers. So I'm with you on that one. It's a problem, yeah. Other members of the public? Yes, over here. Thank you, Mr. Chairman. My name's Patrick Flood. Since the OneCare budget was presented about three weeks ago, I've been trying to better understand the Medicare number, and I don't want this to turn into a big long discussion, so maybe there's a better way to get the answers, and I apologize for being dense, because probably everybody else in this room understands it. When I go to the last PowerPoint slide, the Medicare one. Yes. The first one. I have several questions, and I don't expect them all to be answered in the next two minutes, but the first one is: is it 17 million or is it 13 million? Because when I do the math on those two columns, it's not 13 million. So I don't know what the correct number is, number one. And when I add up the numbers, is that 379, or 339, or 322? For sequestration, oh, sorry, right. For what? Sequestration at the federal level. So there was a difference, and there was an adjustment, but we can document that more clearly. Well, that would be good. The second question is: when the budget was originally presented, I was very surprised to see, in the spreadsheet that was included, that OneCare was claiming 13-plus million dollars in savings in Medicare.
I thought that was a pretty striking result. Today I see, well, that's not exactly right. Or maybe it is, and we were just describing it differently. The line on the bottom says 7.7 million in advanced shared savings and 5.6 in shared savings. So I have a couple of questions. First of all, I've never heard the term advanced shared savings; I don't know what that means. I thought I heard somebody up there say, basically, we have to earn it in the future. Maybe I just misheard. That's number one. Number two is: that 13 million dollars, is that actual savings against the estimated total cost of care? Because there's a number somewhere that I've not been able to find that identifies the estimated total cost of care for '18. I see a number for contracted total cost of care. That's the same thing. Yeah, so that should be, it could also be called, the expected total cost of care. To answer your earlier question, advanced shared savings is a term that's defined in the all-payer model agreement. This is just a cash flow mechanism, making sure that the blueprint money is available before the end of the year, which is when the true-up occurs that actually allows the shared savings to be calculated. Okay, so the question is: did OneCare actually reduce the spend in Medicare by 13 million dollars, and now 7.7 of that is being allocated to the next year? No, it has been spent in that year to achieve those savings. It was spent in 2018. Yes, so it's a cash flow mechanism; it's an advance to OneCare to pay for the blueprint, and those monies are at risk. So they have to achieve the 13.3, or else they have to come up with the 7.7 for the money they already spent to make those investments. Okay, so they spent it in 2018 in advance, or in anticipation of saving that money. In 2018. Yes. Okay, and the final one, and then I'm going to sit down.
So you just answered my other question, which is: the Medicare spend was anticipated to be X, and OneCare came in 13.3 million dollars under that in terms of spending for its Medicare patients, correct? Yes. Okay, so the next question, which we do not have time for today: I would think everybody would be extremely interested in how you did that. What care changes happened to earn 13.3 million dollars in savings? I agree with you, it's a longer conversation. I think for us, it's an investment in multiple strategies: things like primary care, investing in partnerships with our continuum-of-care partners, our complex care coordination program, looking at quality improvement opportunities, and the alignment with the blueprint. We've discussed some of those things in our written submission for our 2020 budget and in our testimony, but we'd be happy to provide additional details as well. And who would I contact to get that additional detail? You could contact me, and I'll put you in touch with the right people. Okay, I really do need to shut up, and I don't want to keep going, but what I will be asking for is this: I understand you've made various investments, but the question is, where did the savings actually occur? In hospitalizations, in ERs, in doctors' offices? Where was the spending less? And let's move on. So, Commissioner, welcome back. We were worried that you'd walked off into the sunset. It's a great question. It's good to see you. Just to follow up, I understood that the 7.7 million was for the blueprint, SASH, and community health teams, and that it was a pass-through from the state of Vermont, federal and state money. And so it's confusing that it's being described as savings. Right, and it does sound confusing, but I think the caveat is what I mentioned earlier: OneCare is still on the hook to earn those savings. That's why it's called advanced shared savings. Hey, can you just identify yourself for the record? Oh, my name's Julie Losserman.
Oh, exactly. Yeah, so did that answer your question? Maybe. So it's not just a pass-through; OneCare is still responsible for earning those shared savings. Otherwise they have to come up with another funding mechanism for the blueprint and the community health teams and SASH, since it is an investment, basically. Again, Julie Losserman. The advanced shared savings term was referenced in a Green Mountain Care Board table on total savings and losses in the results for 2018, and it only seemed to refer to the budget in 2019. So is advanced shared savings part of 2018 as well as 2019? Yes. The advanced shared savings term comes from the all-payer model agreement between the state and CMS. That dollar amount gets trended forward to pay for the blueprint and SASH and CHT as part of the model. Without that agreement, and without that pass-through through OneCare, which distributes the dollars to more than just their network providers, because it's for the entirety of the blueprint, so non-OneCare providers as well as the ACO network, there is no funding mechanism for those programs. Can I just clarify something? The concept is in the all-payer model agreement, but I don't think the all-payer model agreement actually has the term advanced shared savings. The idea is that Medicare was going to put this money towards these programs, continuing the funding that was previously provided through a different demonstration program. The only way to fund that is through the benchmark. So that money is included in the benchmark, paid out in advance, with the expectation that the ACO's target has been increased; it's going to have to live within that, or if it doesn't, then it's on the hook. It is unclear.
Because even when Sarah, prior to this, was showing how the number was built up, she specifically showed the number being built up and then adding another $8.8 million into the total. So we built up the base number, the non-ESRD number, and then we added the $8.8 million in. And so I think the confusion is this: is that $8.8 million on top of what we would get normally, which then passes through OneCare, with OneCare utilizing that $8.8 million in different ways? If that is the case, then we may name it shared savings, but it's not really savings that were generated from the program. Versus the other case, where the base number that we got, the non-ESRD number, would have been higher and has been reduced by $8 million. It's in that base. And the particular fact is that in that base it's a non-claims amount. So that's why it gets trended forward and then removed against, or recognized as, something. So I think it's unclear the way the number was shown, and we've only done that for a year. Well, in some senses, frankly, without the all-payer model agreement, that number would not exist, because if you were doing Medicare Next Generation, there would be no money for the blueprint or SASH, advanced or otherwise. Other public comment? Yes, in the way back. H-12, I'll leave it at H-12: aligning the premium setting with the ACO expected spend target. Give me a second, I was hearing an echo. That's really distracting when you're hearing an echo in the hearing. I'm curious to know, when it comes to the premium setting being aligned to the ACO expected spend target, how medical inflation, and inflation in general, fits into that, especially when you have an inflation rate that might actually, theoretically, be above your possible savings. I've kind of always had this question in the back of my mind.
When you start asking for your rates, are you including medical inflation? If OneCare comes back and says, because of medical inflation alone, this is going to be our rate in the new year, how do you balance that all out? And at the same time, I don't want you to balance it out too much, because I want to have that visibility, that transparency, about whether we actually saved something or just ended up paying it in the premiums, in terms of how you did your risk assessment even. What if risk changed? Risk can be a form of inflation. So, good question in terms of the relationship to the premium, and I'm not one of our actuaries, who spend a lot of time on this. With that caveat: our buildup of rates obviously includes utilization, trend, inflation, a number of components, risk of the population, demographics. But the difference from how you describe it is that it's not the ACO setting its target and going to Blue Cross saying, pay us X. What we have is the relationship where we set our target based on that premium. So the premium is set first, for the overall rate, and then we work with OneCare to make sure it reflects the appropriate rate; they validate and work with us. So it's not setting a higher rate than what is actually in the approved premiums. Any other members of the public? Yes, Susan. Susan Aronoff, Vermont Legal Aid, with the Office of the Health Care Advocate. First of all, Mr. Chair and other members and staff of the Green Mountain Care Board, I want to sincerely thank you for holding this hearing today. I think that it's really important that the Green Mountain Care Board, in its regulatory role, make some efforts to find out if the all-payer model is actually either improving quality or reducing costs, or reducing the rate at which costs grow, for Vermonters.
And I think that holding a hearing like this and collecting this information is one way that you can do that, and that will help the members of the public trust that you're a regulator that's truly interested in the quality of Vermont healthcare and in whether or not this model is working. I have one general question and then one specific question. My general question I can direct either to you or directly to DVHA. DVHA has been collecting data on ACOs since at least 2014. There have been quality and performance measures for OneCare, for other accountable care organizations, in shared savings programs and other programs; DVHA has gotten really good at doing this. DVHA used to report on what the quality results were, as you mentioned, for people in the model and people out of the model, so that you could look and see whether or not the people attributed were having better or worse results, greater or lower utilization of the ER, whatever the measures were; there was a separate set of numbers for the non-attributed lives. So I'm wondering if DVHA is still doing that, and if so, if that information could be made public as well. Since it's a question for DVHA, I'm going to defer to the Commissioner. Well, thank you; I was a bit afraid of that. Alicia has been working on shared savings, in our shared savings program, since, I think, the beginning, to be honest. Thank you for that question, Sue. We do plan on putting together a summary of the quality results that will allow for a comparison of 2017 and 2018 within the ACO program. And we can also add to that the results for the full Medicaid population. I would say the limitation currently is that we don't have the ability with our vendor at present to do a comparison of the attributed population and the non-attributed population separately. But we can look at the attributed population as a subset of the whole population.
I believe we did have a summary that was available for 2017 with that comparison, and we are planning to update that with another year. And I think you mentioned, possibly, financial performance. We don't quite look at financial performance in an apples-to-apples way, because we're not setting an agreed-upon price for the non-attributed comparison population. I think we could probably spend some time thinking about what a helpful comparison might look like, though, because I do recognize that that is a better way to understand this in the broader context of the Medicaid program. That would be great, so I'll follow up with you separately. My specific question is on slide 29, the Medicaid results. And while you're getting there, I just want to say one thing about the Medicare results, which have a similar chart: I really think it behooves all of us to look at the quality performance in both Medicaid and Medicare. On the Medicare results, yes, they're getting 100% this year, because it's a reporting year given to new ACOs or ACOs new to programs. It's kind of ironic, because OneCare is one of the oldest ACOs in the country. But yes, they're getting 100% as a result of the program design. You can look beneath that, though, and see what their actual quality scores were. And I would encourage you, especially the board members and the staff, to look at the quality performance, especially the numbers for things like all-cause readmission. That's an important number, and it's not going in the right direction. Our Medicare quality results last year were about 90%, lower than they had been the previous year. So if our Medicare quality results are continuing to go down, or to go down on significant measures like all-cause readmission, I think that is something we should all be paying attention to as we consider whether or not this is the right model, the right direction to be moving in.
In terms of a specific result on this Medicaid chart: last year, this was a new year of Medicaid Next Gen, and Medicaid did the same thing and gave OneCare full credit for reporting only. This year, that's not the case, except for a couple that were marked, the survey measures, whatever, as NAs. The number that I want to call your attention to is the third one up from the bottom: initiation of alcohol and other drug dependence treatment. You'll see that they barely, barely make it into the 25th percentile. The 25th percentile would be 38.62; they had a 38.87. That's a pretty low score. It's better than last year; last year they got zero, they were below the 25th percentile. This year they're in the 25th percentile, and they got a point. My understanding is that DVHA had some kind of quality performance improvement project set up with OneCare to address the results of last year's quality performance, including this drug initiation measure. It looks like there's been some impact, and it did move into the 25th percentile, but I'm wondering whether or not you have any plans to address quality performance based on the 2018 results. I'll start, and then, Sarah, you might have some additional comments. I think there has been work at DVHA historically on performance improvement projects related to this measure. I'd also note that this is a tricky measure, because the definition that we have been using in the first several years of the program relies on claims data exclusively, and information about some of these encounters lives in state data systems and has not always shown up in claims data. So something that DVHA has done in the past several years is looking at ways that we can draw on those other data sources to get a more accurate picture of performance relative to this measure. And it's something that we've discussed doing in the future: modifying the approach that we use for reporting on performance to include those additional data sources.
I think the other thing that would be important to note is the year-over-year comparison that you mentioned, seeing that there was some improvement from 2017 to 2018. One of the challenges in looking at that year-over-year comparison is that we had a very different population in 2017, with approximately 29,000 attributed members, going into 2018 with significantly more members. And so with the change in population, I think it's a little bit difficult, as some of the other panelists have mentioned, to make appropriate comparisons year over year. And I would defer to Sarah in terms of any of the quality improvement activities planned. So I do want to note that one of the challenges that providers in OneCare's network continue to face is that, with a suite of very important quality measures around both mental health and substance use disorder, we are not able to receive the detail of the claims information, which means that on a month-to-month or quarterly basis our providers in general don't have that information. They don't have signals of whether performance is getting better or worse or staying the same. And so we've been looking at partnering, first with Blue Cross Blue Shield, and now looking to partner more with DVHA, around some alternatives to be able to get data that is de-identified but still timely, so it's not nine months after the end of a performance year before we find out how we were doing as a provider network. And as we're able to do that, with our example of partnering with Blue Cross Blue Shield, we're actually sending that aggregate de-identified data into our provider network and starting to talk with them about how we use it and where the opportunities are. We're really excited about that partnership, and about looking at what's possible with DVHA in that regard. So sometimes it's getting creative and thinking about different ways to crack that tough nut, and that's one example of what we're working on.
Okay, and one clarification on the numbers involved. Sure. So, Sue, you referenced the risk-standardized all-condition readmission measure. In my opinion, the performance on that, if you compare to last year's results, actually increased; the measure is inverted, so a lower score is a higher percentile. So they actually did perform higher if you're comparing to 2017. I'm not going to open that up again; I'd have been surprised if it wasn't like that. It's still not a very good number, I don't think. 40%? They're in the 70th percentile for 2018. Go ahead. I was just going to thank the panel, because I think that's it for public comment. I think this is really good, to shed some transparency on what we see a lot as far as the reporting, because the public doesn't get a chance to see the work that's being done. So this is really good. And I want to thank Susan Aronoff for really putting the bug in our ear to get this out in public, where it can be filmed and the public will have an opportunity to see exactly what's happening. So thank you to all the panelists. I know that you're doing really good work. We're going to let you go and continue with the board meeting. Is there any more business to come before the board? Seeing none, is there any new business to come before the board? Seeing none, is there a motion to adjourn? So moved. Second. It's been moved and seconded to adjourn. All those in favor, signify by saying aye. Aye. Thank you, everyone. Have a great rest of your day.