Great. Well, thank you very much. This session was initially to be co-moderated by myself and Katrina Goddard. Katrina was, at the last minute, not able to travel, although I understand she is listening in, so she'll be keeping track of me. So this is going to be a presentation and discussion, Harmonization of Outcomes and Measures: Selection and Performance. There are actually going to be three brief presentations. We're going to start with Jessica Hunter, who's going to be presenting from CSER. And then Alanna Rahm and Ingrid Holm are going to present two different aspects from eMERGE around some lessons learned for harmonization. So Jessica, I'll have you come up for the first presentation.

So I'm Jessica Hunter. I work at Kaiser Permanente with Katrina. Unfortunately, like Mark said, she wasn't able to make it, so I'm going to present her slides, and I'm sure she's listening in on me as well. For CSER, we're just getting started. We've harmonized the measures and recruitment has started, so we didn't have any data to present. Mostly what we wanted to focus on was lessons learned and the challenges that we've faced along the way. And I think we can all relate to this slide. We got there, but it wasn't quite the path we anticipated taking; still, we did get there in harmonizing our measures.

Some guiding principles that we tried to use along the way: of course, using existing and validated measures. If they weren't validated, were there at least existing measures to use? We wanted to minimize change to any existing measures. There was some debate about whether to keep the harmonized measures short. And then, of course, different sites had different plans for administering surveys, and so we needed to be flexible in doing those surveys.

One thing we had to take into account was site-specific differences. There were differences in population. There were differences in how studies were defining decliners; we ended up landing on defining a decliner as someone who declined sequencing rather than other study activities. There were differences in how studies defined a provider; we ended up landing on whoever was responsible for the downstream care following return of results. There were differences in setting. There were differences in approach. I don't think any of us were planning on giving the organizational readiness to change survey. Not everyone was planning on giving a provider survey, and at least one site wasn't actually planning on giving any surveys. So that was one challenge we had to face. Some of us have access to EMR data and other sites don't, so we had to account for that, as well as differences in recruitment start times.

When work groups suggested harmonized measures, there were ideal times to give these measures. But where we ended up landing in reality is that some sites weren't able to meet these ideal time frames. Some of us changed our time frame for giving surveys to account for what was requested. But of course, reality is not always ideal when it comes to timing.

Some other challenges with the harmonization process were iteration and version control. This was a really big problem early on, particularly using email for documents and communication. It was very hard to maintain version control using email rather than some centralized repository. There were also lapses in communication between meetings, and there was a lot of time between meetings.
There was often communication within subgroups of the work groups where decisions were made, and that led to a lot of miscommunication about the surveys. A lot of work needed to be redone because some sites were ready to start recruitment before the surveys were actually ready. And so when the surveys were ready, there was reprogramming, a kind of refresh and restart that had to happen, and that led to some duplicate work. And then obtaining feedback and pushback: I know some sites didn't always feel like they were being heard, and we weren't always able to accommodate all concerns about the harmonized measures.

Other challenges were around adaptation. When we did have existing and validated surveys, sometimes we wanted to adapt them; they're often very high literacy, and they're not always culturally sensitive. And so given the focus of CSER2 on diversity, we really needed to make these accommodations, and there wasn't necessarily time to do that. There was also harmonizing sensitive questions across sites. There were design challenges, particularly when incorporating harmonized measures into the site-specific measures. Sometimes the response options were the opposite, and so each individual site had to go through each survey and make sure that the survey as a whole was cohesive and easy to follow for the participant. And then work group challenges: a lot of the work groups worked in silos in developing harmonized measures, so there were some competing questions and overlapping measures that we had to account for.

In the end, we are mostly harmonized, but to give an example of the challenges, we weren't even able to harmonize on sex, age, and zip code. But that's okay, because at the end of the day, a lot of the time it absolutely made sense that you couldn't harmonize, given the site-specific goals of the individual projects.

And even though we've developed our harmonized measures and circulated them, we're still facing concerns going forward. One is combining data across disparate sites; our populations and contexts can be quite different from site to site, and being able to combine data from those different populations and contexts is going to be a challenge. We have overlaps in concepts and projects across work groups that we are trying to start working through; these conversations started during the meeting yesterday, including how we handle competing ideas across work groups, specifically for sites that weren't actually planning to give certain surveys, or any surveys at all. There's also the need to provide supporting documentation for the IRB, particularly around analysis plans for giving these surveys, as well as collecting, cleaning, and redistributing data, as we started to discuss yesterday.

Katrina wanted to take the opportunity to remind everyone that Kaiser is leading a supplement on validating a subset of the harmonized measures, working to develop a panel to inform which measures are going to be done. So we're asking for at least one person per site to join this panel. If you're interested, you can talk to... Katrina isn't here, but Chris and Sarah are here, or reach out to Frank by February 1st.

So, some lessons learned to share. One is to plan ahead and build in time upfront for this work, particularly for the sites that weren't giving certain surveys or weren't planning on giving surveys at all.
It was a much different undertaking to give a survey that you weren't planning on giving than to just add items to a survey you were already planning to give. So perhaps build specific plans into the RFA, or allow for a budgeting step once these plans are clear up front. Ensure management of materials, so that there is a centralized role for managing and hosting these documents to maintain version control. Allow time for development and validation of the measures; that was a really big challenge that we faced. And of course, facilitate cross-site communication to align the harmonized measures with the specific hypotheses.

This was from the perspective of the surveys, measures, and outcomes working group, but of course all the working groups contributed to the harmonized measures. So certainly thanks to everyone, particularly thanks to Frank, who was the coordinator, Katrina and Kelly East, who were the co-chairs, and now Christine Rini, who's the new co-chair with Katrina.

Great, thank you, Jessica. So what we're going to do is go through the three presentations, and then we should have about half an hour for discussion. So as questions come up, keep track of them, and when we have open discussion, just plan to move to the microphones and we'll have the discussions.