Hi folks. Thanks for joining. We'll be starting in a few minutes; we're just waiting for people to join. Alright, we'll start as we wait for a few more folks. Welcome to the Adoption Series, webinar number two. This is a series run by the R Consortium in collaboration with PHUSE, and we'll have some other exciting webinars coming in the near future; we'll talk a little bit about that shortly. But before that, can you go to the next slide, please? A few words about the R Consortium: the mission, the methods, and the vision. The R Consortium's mission is to support the R community and the R Foundation, and to help develop the infrastructure required to ensure the long-term stability and growth of the R ecosystem. We do that through financial support to infrastructure projects and through working groups that enable industry collaboration. There are several working groups currently active, such as R-Ladies and the R Validation Hub, and the Adoption Series can also be seen as a working group: a discussion of how we can adopt R in our communities, particularly in pharma. The vision is an R ecosystem advancing 21st-century computational statistics and data science. Next slide, please. The scope of the Adoption Series: it's aimed at those leading R adoption initiatives in the community, but open to everyone. It's focused on the how — how to help the community adopt R in their day-to-day practice. The typical format is a presentation and a focused discussion, which can be a panel, as we're going to have today, or breakout sessions, as we had last time when GSK presented. Future topics we'll be discussing include R packages and package development, using R and SAS together, and defining standards, with many, many more to come. The plan is to have one of these webinars every two months.
So every other month we'll be having one of these discussions. Next slide, please. Today's session has two parts. The first is a presentation, which today is on R training strategies at Janssen, followed by a quick Q&A of about 10 minutes; we may be able to squeeze in a little break before the panel. The second part is a panel discussion with leaders from Pfizer, GSK, Roche, Janssen, and Merck. Next slide, please. Great. So we start with the first part, the presentation, entitled "R Training Strategies at Janssen." My name is Paulo Bargo — maybe I should have started with that. I am from Johnson & Johnson. I was at Janssen and just moved to a new role about two months ago; while I was there, I was developing a lot of these training strategies. Daniel, Gayatri, and I will be going over the topic of R training strategies and what we did to enhance the usage and knowledge of R in our community. Next slide, please. Our agenda: we'll cover the mission we had for this training strategy, the typical challenges we face, and some considerations when developing these strategies. We'll then go into three specific solutions we used in training our workforce: one based on crowdsourcing, one a more traditional classroom leveraging modern technology, and then some discussion of e-learning approaches. We'll talk about practice, reinforcement, support, and change management, because training is not only about having people learn, and it's not just about SAS programmers becoming R programmers; many other things are encompassed in the change of mentality that has to happen. We'll finish with some key takeaways and then go to the Q&A. We will accept questions through the chat, so please send your questions there.
We will wait until the end of the presentation to answer them. Mesh will be relaying those questions to us at the end, so feel free to send questions as we go through the presentation — anything that comes to mind about how we implemented this. We really want to have a discussion on the how-to, so if anything is unclear about how we implemented some of this, please send us your questions and we'll answer them at the end. Next slide, please. So we start with the mission. We purposely put this slide in; we actually discussed whether these should be goals, and we decided this has to be more of a mission. Training is a mission, not just a goal. It's not just about converting SAS programmers into R programmers. It's really about a whole environment, a mentality, that you have to change, and because of that it becomes a mission. Our mission, when we started all of this, was to provide analytical solutions that enable early and effective data-driven decisions to bring novel therapies to market as quickly and efficiently as possible. This means we are bringing in R so that we can increase analytical and statistical programming expertise, focus on disease and therapeutic-area expertise, and enhance the capability of the people working in this area; so that we bring knowledge of industry trends and regulations into our community using R; and so that we support submission planning and deliverables, which is the core of our work, while also doing visualization and implementation that is reproducible. To do that, we have to somewhat change the way everyone is working in this new environment. Next slide, please. There are many challenges, and broadly speaking, we've put some of them here.
There are some key considerations when you're trying to change the mentality of your community. As we're doing training, we always have to keep in mind the need to continue our day-to-day jobs, so it's always difficult to set aside time for training. People have to do submissions. A lot of people are still using SAS and will continue to use SAS as a programming language. There are other trainings you have to take time for — the ones the regulatory agencies and our companies require us to do. All of these are important to consider as you build your strategy. There is a wide spectrum of both experience with and desire for learning R: some people learned R in college or in a prior job; some people have been doing SAS programming for decades with no experience in R. There are a lot of changes happening, and people get saturated with change. There is a limited pool of subject-matter experts you can rely on to help with this mission, and their R skills may be great, but they may not have teaching skills, which are also necessary as you try to implement a training strategy. We also have to think about the infrastructure we're going to be using to make all of this happen. Very importantly, this is a transformational change, and it's not only about a skill-set change: you have to change your mindset, and there is a change in your personal identity. You need new tools, a new environment, a new workforce. All of these are part of how you elaborate your training process. Next slide, please. Some considerations as we're thinking about this, and as you should be thinking as you develop these strategies. One is the audience. What skills do they need? Are they going to be doing submissions? Are they going to do exploratory work? What skills do they have? Are they eager to learn?
Or is this something where people say, oh no, now I have to do things in a different way? Do you have that change-management perspective as you think about these strategies? Do you have trainers or subject-matter experts you can rely on? Some know R and some don't — so how do we bring them to the same level? Do you have people who are experienced trainers? And how are we going to support people after the training is done? Because training is not just about going in and taking a course; you have to apply it, over and over, with support — otherwise it doesn't take root and you can't actually use it. You have to think about the environment where you're going to do this. In one of my examples I'll discuss RStudio Cloud, but many folks may want a specific training environment that is more compatible with the day-to-day work you'll be doing in the future. All of these are considerations to keep in mind as you're defining your training strategy. Next slide, please. So, as I said, we'll be talking about three different approaches that we implemented at Janssen. There are different learning styles and different learning needs; not everyone learns the same way, and because of that, we implemented these different strategies. The first topic is our crowdsourcing approach. Then Dan is going to talk about a more traditional classroom with a modern-technology approach, and also about e-learning as an alternative training strategy. Next slide, please. So the first topic is graphics with R, our crowdsourced training initiative, something we implemented about two years ago. We were able to train a great number of people using this crowdsourcing idea.
We have actually run a few other programs over the past two years to teach other specific skills on different topics, all designed so that everybody learns things that are actually applicable on a day-to-day basis. That's really what we're trying to achieve with this crowdsourcing mentality. Next slide, please. First, let me frame the problem a little. Next slide. There is, of course, an increase in using R for drug submissions, and that's the base framework for all of this training. In this specific program, the audience was a mix of clinical statisticians and statistical programmers. We had a large cohort of people we wanted to train, about 150 to 170 people, which poses a question: how do you find the time? You cannot just take these 170 people, stop all the work they're doing for submissions and everything else, put them in a classroom for a week, and do the training. That's why we went with this crowdsourcing framework. It's also a cohort with very heterogeneous R proficiency. Next slide. We actually did a quick survey of the group. We have a lot of non-clinical and clinical statisticians, and in the non-clinical space, where things are more exploratory, a lot of folks are actually R experts or R users. The clinical space is the other way around: only about a third of that population really used R. So it's very interesting — how do you create a program that caters to all these needs? It becomes a very interesting challenge. Next slide, please. Some of the challenges: it's difficult to set aside time, and retaining the knowledge is a problem. We don't want 150 people to go through a training, come back, not use it for another two or three months, and then forget everything they learned.
We also had to think about a customized curriculum, because we wanted people to use these skills immediately and have the capability to do that as they finished — or actually as they were going through — the program. Next slide, please. A little bit about the strategy of crowdsourced training. I'm going to talk about the framework in four different buckets: who is the crowd, what to crowdsource, how to crowdsource, and what's the incentive to do this. Next slide, please. I'll start with who is the crowd. Yes, you have 150 or 200 people you need to train, and they are part of the crowd in a way, because in the end this is a very interactive network. But the crowd is really the group of SMEs, subject-matter experts, you can rely on: people who have some R knowledge, extensive domain expertise, and a desire to teach and learn. In our case, we had about 30 individuals doing this. Why? Because these 30 individuals were going to work with the larger group of people as mentors and coaches, and there is a lot of reverse mentoring as they do this: many of the people who actually have these skills are younger people training older people who have been doing SAS for a long time. They also take on more formal training roles, creating material that is distributed more formally through webinars, videos, and things like that. Next slide, please. What are we crowdsourcing? The crowdsourced aspect is actually the creation of the curriculum, the creation of the program. What we did was use this crowd, these 30 or so R experts, to help us create the assignments, building a library of graphics and things that people need on a daily basis.
So it's Kaplan-Meier plots, forest plots: what do they need to do, how do they do it today with SAS and other capabilities, and how would we do that in R, so that we can train other folks to do it in R? We started very simple, focusing on graphics and using mostly ggplot2, though it changed very fast as we went. The data we used was mostly simulated and anonymized, because for this program we were using RStudio Cloud as a platform, and it's public, so we didn't want our data out there. We were also crowdsourcing leadership roles, trying to build this mentorship — this network of people connecting with each other through knowledge — which is a really important part of the crowdsourcing experience. And the crowd was also creating advanced content, which we call tips and tricks: material they can promote and give to other folks through different formats. Next slide, please. So how did we crowdsource? We divided this large number of people into groups of about 20 to 30 people; we had six groups. They were separated by therapeutic area, which more or less coincides with region. We actually had a group in China that was independent, not divided by therapeutic area, because it was about 20 people and, because of colocation, it made sense to do it that way. We assigned experts to every single group — about two or three from the 30 experts we had — and then we used other experts to help as assistants; the slide shows a little bit of the roles here. We have the leads: myself and two other people were leading this initiative. We were not creating all the material, which would have been a nightmare, forcing us to stop our jobs to create all of it; we relied on the crowd to create it. The crowd also helped as group leaders within each group.
We also had these assistants helping throughout, answering questions in this mentoring relationship with the people who were learning. We designed the program with repetition. At time zero we gave an assignment, an exercise, and let them work on it for about a week. Then they regrouped in their group of roughly 20 to 30 peers, led by the experts, to see where each person stood and what was going on, and to bring everybody up to speed if they hadn't been able to do anything. In the second week, they would come back and present what was done for that assignment. We did that several times, changing subjects as we went along. Next one, please. We used several resources, which are important. As I mentioned, we used RStudio Cloud, which was really helpful because we didn't have to worry so much about how people would install R, how to set up packages, and all that; it's all taken care of by RStudio Cloud. We used a central repository for sharing. For communication we used Teams, just because that's what's available here, but other systems would work too: Slack or whatever may be available in your community. We created resource documents, videos, and things that could help people get up to speed or get a kickstart on some of the concepts, and we gave the cohort being trained all that information, along with the books that were available. Next one, please. Just as an example, the first assignment was very simple: do an x-y plot of the progesterone receptor versus the estrogen receptor. Very simple. We asked people to add color and shape based on tumor grade and menopausal status, and then we asked them to do a pairwise plot. That was what they were supposed to do. We didn't give them the end result; we didn't say, do this, I want you to create this exact graph. We said: this is the problem, figure it out.
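As an aside, here is a minimal sketch of what a solution to that first assignment might look like in ggplot2, using simulated data in place of the anonymized dataset; all variable names here are hypothetical, and the pairwise plot uses GGally's ggpairs() as one common choice (base R's pairs() would also work).

```r
library(ggplot2)
library(GGally)  # provides ggpairs() for the pairwise plot

# Simulated stand-in for the anonymized biomarker data (names hypothetical)
set.seed(42)
dat <- data.frame(
  er        = rlnorm(100, meanlog = 3.2),                 # estrogen receptor
  pgr       = rlnorm(100, meanlog = 3.0),                 # progesterone receptor
  grade     = factor(sample(1:3, 100, replace = TRUE)),   # tumor grade
  menopause = factor(sample(c("pre", "post"), 100, replace = TRUE))
)

# Part 1: x-y plot, with color mapped to tumor grade and
# shape mapped to menopausal status
ggplot(dat, aes(x = er, y = pgr, color = grade, shape = menopause)) +
  geom_point(size = 2) +
  labs(x = "Estrogen receptor", y = "Progesterone receptor")

# Part 2: pairwise plot across all four variables
ggpairs(dat)
```

The point of the exercise was discovery, so trainees were free to reach a plot like this by whatever route they found.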
And so they went and taught themselves how to do it. We gave them some basic information, some basic resources to work with; they went off, worked on it, and came up with responses like these graphs — a fairly standard example of what happened. They were able to do it within the span of two weeks with the help of their peers: they would go and talk to their peers about how did you do it, how can I help, what package are you using, how are you putting in the colors, and so forth. They discussed with their peers first, then if needed with their SMEs and group leaders, and if they still couldn't do it, with us, the leaders of the program. Next slide, please. And so we kept going with different assignments, increasing complexity as we went along. We went all the way to R Markdown at some point — we taught them R Markdown — so it became very advanced very quickly. I can put this in the chat later: we actually published this as a Shiny application on shinyapps.io. We made all the content we used for this training public, so you can go there, take a look, use it, modify it, and do whatever you want. It's available for everyone to do something similar to what we did. Next slide, please. So what's the incentive? It's focused on two pieces: one for the crowd themselves, and the other for the community, the cohort being trained. First, as a participant in the crowd, you have leadership opportunities you can now work on. There's reverse mentoring, which is really interesting for getting people — especially high-potential employees — to become known as experts in something. And you can further develop your own skills: there is nothing better than teaching someone else to really learn something, have it stick, and really know how to do it. And for the community:
We were able to create knowledge networks where people are really talking to each other about how to do things. The content can be reused: we recorded videos and put everything on shared sites, so it can now be used and reused. There's high retention of the learned skills, and we had about two-thirds active participation of people actually using this. Only a small share — about 30% — of the people knew R going in; of those who participated, by the end almost everyone was at least at a beginner level. That's the sample survey shown in the graph on the side there. Next one, please. And more importantly, about two-thirds of the people who went through the course were actually using this on a day-to-day basis. That was a very interesting metric for us. Next slide, please. Some learnings. At first it's very overwhelming, even for the experts; it's a lot of load, and you really have to change the mindset of how you learn — it's not just a classroom you go to, you actually have to go after the knowledge. Because of that, we initially focused on very small steps, really focused on visualization, but it very quickly morphed into something more complex; we even went to R Markdown, and even to shinydashboard and Shiny in the end. You must have support from upper management, because again it's a change of mindset that has to happen. And what we did in the end was set a final exam for everyone, so that we could continue this education: we asked people to create some top-line report templates using R Markdown for their day-to-day work, and they kept working on this particular exam for another six months after the program ended. All of these things help in creating this new mentality around training.
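For readers who haven't built one, a top-line report template of the kind used in that final exam might be sketched as an R Markdown skeleton like the following; the structure, titles, and parameter names are purely illustrative, not Janssen's actual template.

````markdown
---
title: "Top-line Report"
output: html_document
params:
  study: "XYZ-123"   # hypothetical study identifier, passed in at render time
---

```{r setup, include=FALSE}
library(dplyr)
library(ggplot2)
```

## Subject Disposition

```{r disposition, echo=FALSE}
# summarize disposition counts here, e.g. with dplyr::count()
```

## Key Endpoint Summary

```{r endpoint, echo=FALSE}
# summary table or figure for the primary endpoint here
```
````

Parameterizing the study ID (`params$study`) is what lets one template be reused across trials for day-to-day work.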
Next slide, please. And with that, I'll hand over to Dan. Thanks, Paulo. So next I'm going to discuss our experience conducting a more traditional classroom approach. Now, when I say traditional, I mean 2020 traditional: we were planning for this to be face-to-face, but, like everything else in 2020, we quickly needed to pivot from face-to-face to virtual, and that really inspired us to rethink how we were going to maintain engagement and how we would use technology to do that. Just a few seconds about our audience. It consisted of about 300 programmers across the globe, very experienced in programming concepts, but almost exclusively in SAS — very little, and in most cases no, R whatsoever. Another thing to keep in mind about the audience is that they're very focused on timelines: we're talking about a population constantly faced with database locks, submissions, and health-authority requests, so they really don't have a whole lot of time to play; they need to get in, learn the material, and apply the material. So our challenge, at a high level, was to teach how to use R to program and/or QC statistical analyses. But really what we're talking about here is a fundamental change in mindset, from "I'm a SAS programmer" to "I'm a programmer/developer who uses SAS, R, and potentially other tools to perform statistical analyses." So how did we piece together this strategy? The first piece of the puzzle was to identify the core concepts to focus on, and we pretty quickly narrowed in on several key skill sets: we wanted people to be able to use R to read in data, to transform data, to summarize data, and then ultimately to produce formatted output of those summaries.
Once we had those core concepts, we could turn to R and identify the packages that, number one, supported those core concepts, but also — just as important — were likely to ultimately be part of the Janssen validated environment. Once we had the core concepts and had identified the packages, we had our curriculum; then we needed to recruit a training team, and there we wanted to keep in mind not just R skills but also folks who had training skills. Finally, it was time to develop the training, and there were a few key guiding principles we wanted to keep in mind. One, we wanted to give people quick wins, so they could very quickly see how they could use these concepts. Two, we wanted to keep the examples and scenarios very relatable to their jobs. And again, especially given that quick pivot to virtual, we wanted to make sure our lessons were developed in a way that kept people actively engaged, even in a remote setting. Here I have a bit of a deeper dive into our planned curriculum. We started off with a general introduction to R, keeping those key concepts in mind. Our goals for the first set were to understand and use RStudio, import data using the haven package, transform data and produce statistical summaries using the dplyr and tidyr packages, produce basic formatted tables and listings using huxtable, and produce basic figures using ggplot2. Given that foundation, the next step was to add training on the internal statistical reporting package, providing users with a standard approach for producing highly formatted statistical summaries in an efficient manner.
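That import-transform-summarize-format pipeline can be sketched end to end with those same packages; the file name and ADaM-style variable names below are assumptions for illustration, not Janssen's actual data.

```r
library(haven)     # read SAS datasets (.sas7bdat)
library(dplyr)     # transform and summarize
library(huxtable)  # basic formatted tables

# 1. Import: read a SAS subject-level dataset (hypothetical file name)
adsl <- read_sas("adsl.sas7bdat")

# 2. Transform and summarize: age statistics by treatment arm
summary_tbl <- adsl %>%
  group_by(TRT01P) %>%                       # planned treatment (assumed name)
  summarise(
    n        = n(),
    mean_age = mean(AGE, na.rm = TRUE),
    sd_age   = sd(AGE, na.rm = TRUE),
    .groups  = "drop"
  )

# 3. Format: turn the summary into a simple presentation table
ht <- as_hux(summary_tbl) %>%
  set_bold(row = 1, col = everywhere) %>%          # header row
  set_bottom_border(row = 1, col = everywhere)
print_screen(ht)
```

Each lesson in the curriculum essentially taught one stage of a pipeline like this, which is what made the "quick wins" possible.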
Finally, to build on that foundation, we started to add additional tools to the R toolbox, so to speak — be that data-manipulation tools such as dealing with dates or strings, diving into statistical testing, or even building your own functions. With the curriculum in place, as I said, the next piece was to develop or find the training team. Part of that, of course, is finding or developing people with R skills, but then also focusing on this extra need for training skills, given the virtual nature. We needed people who would be good communicators, good time managers, and good listeners, who would be creative in how they put these lessons together, and who would be patient. A lot of our course was, of course, going to use the traditional chalkboard method of lecture and demonstration, but again, being virtual, it's important to be able to leverage other types of teaching styles — whether team-based or group-based learning approaches or game-based approaches — and weave them into the lessons to keep people motivated and engaged. And finally, we needed to make sure people had the skills to use these tools for remote learning, whether that's Zoom or MS Teams or anything else: you need to be able to send people off into breakout rooms and manage that if you want a group learning style; use things such as Poll Everywhere to add interactivity, polls, and formative assessments; and even something as simple as managing participation. You can't have everyone off mute talking over each other, but you do want that participation, so use something like the chat or emojis to make sure folks keep participating. And just a little side note on that: there are a lot of organizations and vendors out there that provide training — not just on R, but on training in R.
We happened to use RStudio's instructor training and certification program to build some of these skills. Once we had identified the curriculum and the people to teach it, we needed to develop the actual training. Again, in the spirit of actively seeking ways to keep people engaged, here we've laid out just a few of the things we did to keep our lessons moving along. One is class polls: these allowed us to gauge understanding, but also — just as importantly — to keep the students active in the lesson and not just watching. We also used guided exercises, and there are a number of different approaches to these; here I have a fill-in-the-blank example. The idea is to let students reinforce their skills in a very quick, efficient way, focusing specifically on the given task at hand rather than a full-blown exercise. We took every opportunity, in the lessons and in the exercises, to connect these new skills back to existing SAS skills, since that was where the great majority of the students' experience was. And finally, while we used a lot of guided exercises for quick hits, there's still a place for full exercises; we just tried to be judicious with them, and generally used them more as a wrap-up to a lesson. Other things to consider when it comes to keeping your lessons moving and efficient: the programming environment. You want to make sure the teacher and the students are all on consistent versions of R and RStudio and are using a consistent set of packages. You don't want to spend class time debugging; you want to go right to the content you're trying to teach. In the same manner, you want to work with familiar data and familiar scenarios where possible.
You don't want students spending their energy learning the data or the scenario; you want them to understand it quickly and be familiar with it, so they can focus their learning energy on the topic you're trying to teach and not the data you're using. Then a couple of things that are a little more obvious but still important. Class size, especially in the virtual world: it's very difficult to manage a large group, keep them engaged, and not have the class get unwieldy. Generally, we found that above about 15 to 20 people it became difficult to keep everybody engaged and keep the class moving, so think about the class limit you want to have. And class duration: no matter how interested the student is and no matter how interesting your content is, eventually everybody starts to look at the clock. You have to figure out where that limit is, and we found that most students preferred something in the one-to-two-hour range; beyond that, interest started to wane and then activity started to wane a bit. So, some outcomes with our traditional approach. We trained approximately 300 individuals. If we look over here on the right, we asked the students to rate their confidence in performing certain basic tasks in R that we were training on — whether they felt they could definitely, probably, possibly, or probably not achieve that task. In the salmon color you can see their responses before the training, and in the blue you can see their responses at the end of the training. We saw a definite shift from "probably not" toward "probably" and "definitely," so we felt really good about the end result of our lessons. Down here in the lower left you can see some sample comments from our training feedback; a general theme was that they really liked the examples related to the job.
They appreciated all of the comparisons to SAS, and they really liked when we were interactive and mixed up our lectures with opportunities for them to try things out in a quick, efficient manner. In terms of constructive criticism, they did ask for shorter lessons — as you heard, one to two hours was the ideal. The rest of the constructive criticism was really along the same lines: they liked the SAS-to-R comparisons but wanted more, and they liked where we used clinical analysis data in the training but would have liked even more. Some very common themes there. So now I'm going to take a few minutes and touch on e-learning, and the question is, when might you consider e-learning as an approach? There are a couple of things to consider. Maybe you don't have the in-house expertise, and that could be R expertise or expertise in training. Or maybe you have both — people who are fluent in R and really good trainers — but you just can't spare the time or resources to build an internal training course and administer it. These are all situations where you might consider an e-learning option. Another is if you have a small group you want to train that is very homogeneous and made up of independent learners; that also syncs up well with e-learning. And finally, it's not either-or: another option is to have an in-person or in-house training and supplement it with an e-learning course. Now, if you do go with the e-learning approach, there are some things to think about when selecting the course. One: do you want a pure e-learning approach or a hybrid? There are e-learning courses out there where it is strictly you and your computer.
And then there are others that are more of a hybrid: it's e-learning, but the vendor offers open office hours where you can go and seek additional help. Secondly, how does the platform provide feedback on your R code? This ties into the first point: some platforms will evaluate the code as you're going through the exercises and provide instant feedback on whether you did them correctly, along with possible hints. Other platforms will offer a link to solutions and then, again, offer open office hours to discuss additional questions. So that's really a preference on what you want to use. Third, do you want a set training timeframe, where your students start the e-learning at one point and end at another explicit point, or are you looking for more of an open-ended resource, a just-in-time approach where individuals log on, look for the topic they're interested in at that moment, learn it, and go apply it? Number four, think about how important it is for you and the learners that the examples and scenarios used are relevant to your daily job, because there's a big variety there across the e-learning platforms as well. Some data is extremely generic, and there are platforms out there that specifically use examples from clinical trial data. And finally, and this sort of fits into the first four: are you looking at this e-learning for your entire group, or just a small group of subject matter experts? Is the group homogeneous or heterogeneous? That will weigh in on your decision as well. All right, so those are the two additional training options that we looked at at Janssen.
Now, getting beyond training: you've completed your training and, regardless of the method, then you need to use it or lose it, right? So here are a few ways that we offered additional practice and reinforcement opportunities at Janssen. The first one was capstones. For the internal traditional learning we developed comprehensive assignments meant to reinforce all the topics presented in the training. We also selected teams to work on pilot studies, where they basically mirrored an actual CSR effort being done in SAS: they programmed the outputs in R and had the opportunity to compare what they had done to what the trial team did. And then, for everybody, we encouraged the use of R in the QC of tables, listings, and figures, really focusing on those topics we covered in the training and using them in their day-to-day jobs. Of course, if you're going to ask people to practice, you need to offer support and answer their questions, so here are a couple of ways we did that. We offered open office hours, where the trainers and subject matter experts made themselves available at pre-specified times so people who had questions could log in and ask them, or log in and listen to the questions other folks had. We also had a 24/7 help desk where folks could post a question, the subject matter experts would answer online, and the questions and answers would be posted. So the next person could come in and check whether their question had already been asked and answered. If so, great; if not, they could post their own question, and that way we built up a database of questions in the community. And finally, we had a series of recurring meetings that we called R Walks. These were really split into two sections, the first section being a pre-specified training topic given by someone in the community.
Usually 15 to 20 minutes, no more than that, on a small topic of interest. The second half of the session was just an open forum for people to ask questions and discuss any issues they were coming across. So, with that, I'm going to hand things over to Gayatri, who is going to talk about how the trainings, the practice, and the reinforcement all fit into the larger change management plan. It was awesome to hear the ins and outs of R adoption from Dan and Paolo, and you can understand how all the challenges were pressure tested with the mitigations they just presented. To start off the presentation, I want to say: it is not the strongest of the species that survives, nor the most intelligent; it is the one most adaptable to change, as the quote attributed to Darwin goes. Next slide, please. As Paolo already alluded to, this is a transformational change we are talking about, so two years ago, when we started doing this, we evaluated the forces for the change: what we already had available, what challenges we were going to face, who our audiences were that we would cater to, which partnerships we were going to connect with, and also which existing connections we were going to leverage. We evaluated all of that because, as you know, this is an open source technology, so there's a lot of knowledge available; we had to evaluate what we could use, how we could use it, and how it would be available to us. There was also quite a lot of transparency with R and the other open source tools: it was all global, people were ready to give out all the code, everything was there. But there was also some level of customization we wanted, so we could tailor it to our people's needs. Of course, this was digital convergence; we had so many things going on at the same time.
When all of these things were changing, how each organization took them into account was very important to evaluate at this point. So what we did was socialize with our leadership team. Our leadership team was very confident in what we were doing; they were even more proactive than we were. They also encouraged our creative confidence, which you can see in all the creativity our people came up with in our trainings, our rollouts, and the supporting mechanisms that were set up. As we were building training awareness, we also did a stakeholder analysis to understand who all the stakeholders were, who was going to be affected, and who we had to provide with awareness and knowledge of what we were doing here. We also did a good impact analysis to understand not just our training, planning, and communications, but also how we could roll out and how we could sustain it, because sustainability was one of the biggest questions here. As Dan said, it's not like you teach and then leave it: if you don't use it, you're going to lose it, so we also had to see how we were going to sustain it. So, apart from having partnerships and connections, we had small group trainings, R hyper-care support, SME support, and the Walks and Talks that Dan just shared, all part of our planning. Next slide, please. Here comes the big picture. Apart from doing all the detailed planning, all the operations, all the logistics of training 300 people across 80 sessions, you can imagine, in a global company spread over three or four regions with all the time zones, how do we even manage? And then came COVID. All the training we had planned to do face to face turned virtual. Now we had to go learn Zoom, learn surveys, learn everything, even before we went to teach. So the challenges in front of us were paramount.
But since we had a well-laid plan already, we were able to execute in short modules. This plan didn't just come top down or bottom up; it was everywhere, meaning it was embedded in our goals and objectives. It was the mission, as Paolo stated already, and each person had this in their G&O. The leadership was very cooperative in helping us implement all of this across all blocks. Apart from the business strategy, we were also very realistic: we knew this was not going to happen in one calendar year. So we had it as a multi-year initiative rollout, which means we train people this year, we implement through piloting next year, and then we see the return on benefits in future years. We were very pragmatic in applying the principles we practiced here. In our second block you'll be able to see how we did it inside our company and also outside the company, because, as I said, this is a very transparent forum. So how could we leverage all the knowledge out there? We had a very good team, including Dan and also Sumesh, who always had an ear to the ground to understand what was happening outside and bring it inside through their external engagement, plus connections with our leadership, who also poured extra knowledge into us. We were able to thrive through our trainings and our experimenting. We have great kinesthetic learners; we're all programmers, right? We all want to experiment and then learn from our experience. It's not like other software courses where you can just lecture; they want practical experience, so that is what we provided them. We also ran this as a multi-pronged initiative, which means there was a training team and a pilot team; we divided and conquered, and it was more like matrix management. So all the small pieces that went in here were not done by just one group; it took a village to do the training here.
And of course, everybody learns differently. We have visual learners, and we have people who have challenges learning through just videos; they want materials to learn from. So we leveraged all the other forms and means of learning so that we could cater to all of our learners here. And of course, communication was key. It's not a checkbox; you cannot just say, here we go, we rolled out R, we're all done. That's never going to happen, because the change is not going to stick. We have to tell them at least three times for things to stick, but our people were really, really good at acclimating to the new situations and running along with the change. That required thorough communications, and people have to be part of the change; they have to own the change, otherwise the change is not going to stick with them. And of course there will be some resistance management. Why? Because you have to imagine these people have 10 or 20 years of experience in SAS, and moving them off of SAS and introducing a new language is going to be challenging, because for them everything is going to be new. So we gave them an opportunity, like 5% or 10% of their time, so it was more fun for them to get involved. Some are learners and some are teachers, so we also gave them opportunities to be mentors, to be part of the train-the-trainer program, or, for people who love to organize these meetings, to do that; everybody had a share to contribute in this whole initiative. And of course, as you are building the ship you also have to keep moving, so there were other challenges: we had certain studies that were already trying to provide outputs in R. We had some regulatory touch points, we had some lessons learned, and everything was in a closed loop. So the communication was very thorough throughout the whole process.
So we kept the whole group replenished with all the knowledge that we came across. Next one, please. Of course, it's not a start-to-end happy story; it was a very winding road. We planned. We had our team assembled. We had our governance, so we knew what we were doing, what kind of metrics we were going to calculate, how our access controls worked, how all the systems would work. The initial challenge was really, really hard to manage with the training, but we still tried, and we came out of it and into rolling out. That was another thing, because, as you see, the rollout didn't happen like, on the 30th of September we're rolling out R. It's a process, it's a journey, and that journey comes with so many changes, and all those changes have to stick with the people. We usually say well planned is half done; we got it all well planned, so half of the work was already done, and the rest was only implementation and carrying it on into the R world, where we're going to increase our engagement and our pilots in future opportunities. Next slide, please. So, finally, a few takeaways. When it comes to training, it's not one size fits all, like I already said. We have to identify our goals and know our audiences, and we also have to have our resources in place, because it cannot be boring classroom teaching; it should be very interactive. Second, keep your training interactive and related to the job, otherwise your people are not going to stick with you. Third, provide your audience with opportunities to reinforce the learning. As soon as they learn, they have to start implementing it, otherwise it's so hard for it to stick. And provide your audience with post-training support. Like Dan said, apart from the e-learning support, we also had a help desk, which built up a repository of Q&A.
And then it was so interactive: anytime you ask a question, it tells you whether that question was previously answered, and if not, there is a support group who will answer the question, and it goes back into the system, so it's constantly building up that knowledge for us. And again, training is not the only aspect; you saw that two slides ago. It's the whole process that has to be planned strategically, tactically, and operationally, and implemented at all levels, because this is not just awareness from the people's perspective, it is also awareness from the sponsors' perspective, which is the leadership team. Everybody needs to know their goals and objectives, and that's how this whole ship runs. With that, I think I'm handing it over to Sumesh for any questions. Thanks, Dan, thanks. We don't have a lot of time; we're a little bit over, so we have just a few minutes for questions. Sumesh, if you can give us some of the questions. There are many questions in the chat. Basically, one of the questions Julia raised is about the Shiny app training; she wants the link. I think we already have that in the presentation, so maybe, Paolo, if you want to share that. Yeah, I just posted it in the chat right now. So yes, feel free to use it any way you want. The only disclaimer is that it's still a work in progress; we are still developing that app a little bit, though not so much the training part. Perfect. Thanks. And Hona, I think you have a question. I've been wondering, what is your experience or thought in terms of predicting which of the existing base R or CRAN packages are highly likely to become part of the validated environment? I think this is a little bit out of scope for this session, but if it is more related to the training, it's basically what Dan and Paolo alluded to: we use things like the tidyverse or ggplot.
So those are the packages that we feel will be very useful in the training environment. I also see Mike talking about using the learnr package; that would be a great package for developing any kind of training tutorials or sessions, actually. And I have the last question: can we get the recorded presentation? The recorded presentations will come; there will be a link on the same site where we have the registration, so links for the webinars will be there. One comment about validation: we had the first webinar with GSK, where Nicholls presented, which is already posted, as far as I know, on the website. So you can find information there about how GSK is approaching the validated environment. I think we have thought about these packages as ones we really use in the validated environment. The difficulty in predicting this is that people are going to have different environments, right? So that's something where you have to really look into the environment you're going to be using. And as Dan was showing, we even have our own package that we're going to be training on, for submissions, for creating some of this material. So you have to look at it from that perspective. Just a quick comment on that: I know it's a little off topic, but that's actually been discussed, starting I think this week, between Roche, Novartis, and GSK: creating and validating a common pool of CRAN packages that will be available to all of pharma. So that's a conversation that's happening right now, basically. And I think the next webinar is going to be about packages, right? And you guys at Roche are doing that, right? Yeah, yeah. Stay tuned; in two months we have another one. I'm sure Kieran is going to be there. Any other questions?
If not, Sumesh... There is one more question coming in on the chat: does Janssen currently have a sort of R-only submission in the pipeline? I will redirect that to Olivier. Olivier is on the call, so Olivier, do you want to chime in on that? I didn't catch the first part. The question was: does Janssen currently have a sort of R-only submission in the pipeline, a complete study submitted in R? That's a question I think a lot of the Janssen staff are asking, and my answer is always: not near term. Because, you know, a submission is not made of one study, maybe excepting oncology, and even that's less and less true; it's made of a combination of studies, some of them being done today, some of them done 10 years ago. So I think for the years to come there will still be mixed packages, a mix of SAS and R. There are also some situations where we might decide to stick with SAS for some specific activities. But theoretically, I don't think there's anything preventing it, and we're seeing it with the emergence of ADaM packages and TLF packages; there's most likely a way to do it 100% in R if you wanted to. Thanks. With that, Sumesh, I think you should start the panel. Yes, I should. So I would request all the panel members to switch on their cameras, and when I'm reading your bio intro, just raise your hand so people know who you are. I'll start by introducing the first member. Amy Gillespie is an associate vice president at Merck & Co., where she heads the global statistical programming organization. Amy has been with Merck for 24 years; she previously worked at ICON Research as a biostatistician and at Prudential Insurance as a quantitative marketing analyst, and she was also an adjunct professor at Ursinus College. Amy has an MS in Applied Statistics from Villanova University and a BS in Quantitative Business Analysis from Penn State.
Welcome, Amy, to the panel. Thank you. I'm going to introduce the next panel member, Olivier Leconte. After obtaining his MSc in mathematics applied to informatics engineering in 1998, Olivier joined the pharma industry, specializing in software development of SAS applications for biometrics teams, for Sanofi among others. In 2006 he moved to Roche UK as a statistical programming and analysis associate director, taking on responsibility for statistical programming teams in immunology, Europe, and neuroscience. During that time Olivier held several different roles within the PHUSE organization, including 2016 EU Connect Chair and EU Single Day Event director. In 2012 he joined Novartis Pharma as an executive director, where he built a statistical reporting group to support the general medicines pharmaceutical portfolio. In 2017 he took the lead of the Data Operations development unit team, coordinating all data management, clinical programming, and statistical programming activities for Novartis development, as well as supporting the launch of the Novartis clinical data science project. Since July 2018 he has been the head of clinical and statistical programming and analysis at Janssen, a pharmaceutical company of J&J. Welcome, Olivier, to the panel. Thank you, Sumesh. Let me introduce the next panel member. Mike Smith has worked in a variety of roles at Pfizer over 28 years and is currently a senior director in Statistics, providing specialist computation and modeling solutions to projects and evaluating new tools. He is an RStudio certified tidyverse trainer and a proficient R user. He was previously in a similar role for the pharmacometrics modeling and simulation group within Clinical Pharmacology at Pfizer. He has trained almost 500 Pfizer colleagues on the tidyverse. He is the author of and collaborator on many manuscripts pertaining to model-informed drug development, modeling and simulation, and the implementation of tools and workflows to assist this.
So welcome, Mike, to this panel. Thanks. The next panelist, Kieran Martin, has been at Roche for six years, working across a variety of molecules and projects as a statistical programmer. During that time and before, he has been an advocate for the benefits of using R, and more recently he has taken on a role directly influencing the future direction of how PD Data Science at Roche will be using R. Welcome, Kieran, to the panel. Last but not least, Michael Rimler is a director of clinical programming and an innovation leader in the Technical Excellence and Innovation group at GSK. He has 12 years of experience reporting on clinical trials, providing both technical and analytical support to programming teams. Prior to joining GSK in 2018, he worked at multiple CROs and served on the faculty of Xavier University as an assistant professor of economics. In addition to leading innovation activities within clinical programming, Michael serves as the primary business lead for the integration of R into GSK's clinical reporting process. In this role he oversees activities driving the use of R for independent QC, developing standard reporting tools in R, and exploring opportunities for using R for the generation of trial results. Externally, Michael is also co-lead of the PHUSE working group on multilingual clinical reporting, a sub-team lead for the TransCelerate modernization of statistical analytics project, and co-chair for the 2022 PHUSE US Connect. Welcome, Michael, to the panel. So we have a great team. Let me start with a question in the interest of time. We all know a very interesting topic was presented, so my question is about the concept of build versus buy, or a hybrid of both. What is your approach when creating a robust training strategy? And the second part of this question: what has your experience been with the challenges or the benefits when you go for this kind of approach, either build, buy, or hybrid? I'll start with Amy.
Okay, great. Thank you, Sumesh. Great question. I'll talk a little bit about the strategy we've employed here at Merck, and it's not so different from what we heard described by our presenters from Janssen. We basically pursued a strategy of both building and buying. The reason we went this way is that we recognized that a multidimensional training strategy for R is most valuable, because everybody has different learning preferences and different needs; really recognizing that one size does not fit all. So we pursued an internal training approach where the training was provided by peers, peer-to-peer training, and we developed three different levels of training. We also supported staff in using Coursera if there was interest in more of an on-demand training approach, and we pointed people to a course that was very similar to our internal level one training course. We're also currently partnering with Atorus to develop interactive, on-demand training, again targeted specifically to our needs and very similar to our level one peer-to-peer training course. And then we made additional resources available to our staff, whether different books, webinars, or the SAS-to-R cheat sheets we developed. We heard in the presentation earlier today how people wanted to understand how to do things in R that they were more familiar with doing in SAS, so we made a cheat sheet available to our staff. And we also have very thorough web-based training pages with a lot of different content, where people can go to learn and seek out information and resources as needed. Okay, great. Thanks, Amy. Olivier, can you share your experience from Janssen? I think Gayatri went into that strategy in detail, so I would not have much more to add. On the experience, what we've learned is that training is just one part of it.
The biggest part is the experience. You know, when I graduated in 1998 and took my first SAS training, I would have been absolutely unable to create an ADaM data set afterwards. And I think that's the same challenge people are facing: they go to an R training, but that doesn't mean they can build. So what we've learned is: don't be afraid to take the training a few times, practice, offer different types of practice, and then, especially next year, we're striving to get to the real work and not just pilot work or initiatives. And what we're doing today is also, for me, very important in terms of learning and sharing with others. Because behind R, the technology, the real challenge is the open source mindset, which I think we are all adding to, and that's what will make the difference. And basically, the presentation clearly stated that we went with the hybrid mode of both buy and build. For e-learning we used DataCamp and things like that, so pretty much, as Amy mentioned, we did the same kind of thing. Right. So, Mike, do you want to share your experience from Pfizer? Yeah, so so far we've bought in training from a consultancy group, but since I became certified I've built on the good work that's been done by the RStudio education group. They post all of the Master the Tidyverse course training materials online, amongst others; many, many courses are up there. So what I've done is to redeliver that training internally. I think the next step on from what I've done is to adapt that: insert more relevant data sets, or tweak what's there to make it more relevant based on what colleagues are telling me.
But actually, starting there with the generic example data sets, like the Palmer Penguins data set or the mtcars data set, gets people through the door. What I need to start thinking about next is how to augment that, either through learnr tutorials or through a flipped classroom, where I can say: we have the materials, we have the videos of those training sessions, review them, now let's come back and look at a real-life example data set and try to apply what you've learned. One of the other things, though: it seems to me that if you buy training, if there's been a cost involved to an individual or to an organization, colleagues are more likely to turn up and pay attention. It's actually quite tough if it's internally delivered training; they can kind of switch off and go, oh, I'm not paying for this, so it doesn't really matter, I'll just do this email or this thing over here. So there's a kind of importance that the transaction brings, and it's interesting to hear how Janssen and Johnson & Johnson are handling that: no, this needs to be something you devote your time to, and so you're allowed to take the time in your day job to actually learn. Okay, that's great. Kieran, do you want to share your experience from Roche? Yeah, sure. So I think we have gone for a hybrid approach, but we prefer in-house, and certainly I prefer in-house where possible, because you get a lot of advantages from that: you can deliver in-house training on your schedule a bit more easily, more at the point of need, and you can use your own environment and your own data, which relates to what Mike was just saying. And also, to what Olivier was saying, training isn't the only part, so you really want to target people who have a use case, something they can apply the training you're going to give them to.
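As an aside, the kind of generic starter exercise Mike describes can be sketched in a few lines. This snippet is not from the webinar; it's an illustrative example of a first tidyverse "pipe" exercise on the built-in mtcars data set, assuming the dplyr package is installed.

```r
# A sketch (not from the webinar) of a typical first tidyverse
# exercise for SAS programmers, using the built-in mtcars data set.
# Assumes dplyr (part of the tidyverse) is installed.
library(dplyr)

# Summarise fuel economy by cylinder count: group, aggregate, sort.
mpg_by_cyl <- mtcars %>%
  group_by(cyl) %>%
  summarise(
    n        = n(),
    mean_mpg = mean(mpg),
    .groups  = "drop"
  ) %>%
  arrange(cyl)

print(mpg_by_cyl)  # one row per cylinder count: 4, 6, 8
```

The value of a generic data set like mtcars is exactly what Mike notes: learners already know what the columns mean, so all their attention goes to the grouping and piping idioms rather than the data.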
That said, we've used various platforms, both in terms of online training and directly delivered training, in targeted ways where it makes sense. I'm definitely leaning towards in-house training where we can do it; the challenge is always resources, and I definitely agree with Mike that sometimes you don't quite get the audience you might have otherwise. I don't quite have the answer to that one yet; I think the main thing is just to bang the drum about how R is becoming more important, so people are incentivized to go on these trainings. Yeah, that's great. That's good, because that's what Gayatri presented in a slide: the mindset needs to change, and that's been very important. Michael, your experience at GSK? Yeah, so at GSK I think we were almost exclusively a build shop; we did everything in-house. From a fundamentals perspective, our statistical data sciences group had a couple of core training modules that were available as instructor-led training and delivered that way, serving those who are working on GxP-type work. We still focused on the user: very early on in our process we recognized, as others have mentioned, that the experience matters. We wanted to make sure they had an on-the-job opportunity to use what they were learning coming out of those instructor-led courses. And then we focused on supplementing that with resources. We built our own internal guidance document, as we call it, that had real use cases. We created a completely synthetic database of four ADaM domains, with about 24 subjects, that we could read in to generate some of our templates. We could write scripts and give them example code, kind of template code, to give them some place to start from.
You know, we looked back at how we learned SAS and said, just like Olivier was saying, you read a book, you take a course, you take a test, but you can't really do an ADaM data set. How I learned and became proficient in SAS was by looking at other people's code and tweaking it to fit, so the inputs got to the outputs I needed. And we took that approach. We built, I think, a pretty good infrastructure, and then offered other opportunities leveraging things like this. Recently we started an R book club, modeled after one presented at the RStudio Global conference last year. We have our Coder's Corner, kind of a monthly exercise that gets people involved and engaged; it might not be a lot of teaching and learning, but it gets them engaged with R and gets them in the door so that they then want to learn. And there are other things we've offered along the way that supplement that instructor-led training on the fundamentals and help close the gap, so that when they're doing on-the-job activities they continue to learn and feel like they have the support to progress. Yeah, I think that's great, actually. Just a quick question; maybe this can take 30 seconds to answer, or it may be a lengthy one. What was your experience of resistance from the programmers, basically SAS programmers moving into R, and how did you manage or address that resistance? Mike, I'll start with you; I'll go in random order. Um, I guess I haven't seen much; I'm not working directly with the statistical programmers. But I take the point made earlier that it's often the more junior colleagues who learned R in college, learned the tidyverse in college, and so on. It's kind of the old lags, the people who learned base R ages ago, who go: oh, I don't need to learn this stuff, this newfangled way of working; I'm perfectly efficient today using this.
And, you know, that's a hard one to overcome. There's the flip side of: well, you may be able to do all of this in base R, and that's lovely, but actually the person QCing your work is more familiar with the tidyverse way of doing things. So if you migrate towards the harmonized way of working, you may struggle to get there, but it will make the QC much quicker, or we could have a more generic script for doing this. So the resistance is sometimes just about efficiency: I get my job done really quickly today using the tools and methods I know; why should I learn this new thing over here, because it's uncomfortable, I need to learn it, and I'll be less efficient in the short term. That seems to be the general feedback here.

Kieran, you work closely with the programmers, so what was your experience on that, and is there any remediation that you did to overcome it?

So it is tricky. I'm aware there are some people who are quite resistant to it. The funny thing is that typically those aren't the people who talk to you; the people who talk to me are the ones who want to learn and are excited to pick R up, and I would say that's the majority of people at this point. I think what has brought people over, because there was definitely more resistance initially, is some clear use cases. For us, Shiny was a big wedge in the door, because it was a kind of USP; it was clearly something that would be very difficult to replicate in SAS. So it was a nice thing to say: look, you can do this, and you couldn't do that in SAS. That was a good way to sell it to a lot of people.
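Mike's point about QC and harmonizing on a tidyverse style can be made concrete with a small, purely illustrative sketch: the same summary written in base R and in tidyverse style. The data and names here are invented for illustration; neither version is any company's actual standard.

```r
# Illustrative only: the same group summary in base R and the tidyverse,
# showing why a team harmonizing on one style can make QC review quicker.
library(dplyr)

df <- data.frame(arm = c("A", "A", "B", "B"), aval = c(1, 2, 3, 5))

# Base R: terse and correct, but each author tends to write it differently
base_res <- aggregate(aval ~ arm, data = df, FUN = mean)

# Tidyverse: a shared, readable vocabulary the QC reviewer also uses
tidy_res <- df %>%
  group_by(arm) %>%
  summarise(mean_aval = mean(aval), .groups = "drop")
```

Both produce the same means; the argument in the discussion is not about capability but about a common idiom that the person QCing the work can read quickly.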
You know, I think it's harder with the things where, I would argue, R can solve a lot of problems at least as well as SAS, but those can be harder to demonstrate than something that is not impossible in SAS but extremely difficult to do there. So that was, I would say, a big way of converting people. But I think to a certain point there will always be a small pocket of resistance, so to speak. And that group you bring along by, again, bringing the work: saying that more people are working in R, you need to keep up with them.

Olivier, your experience?

I think there are three types of resistance. The first is: why? Why are we doing this? You know, SAS is fine. Why change everything? I've been working in SAS for 25 years, so why change it? That's the first type of resistance we've heard. The second, I think Mike was spot on: efficiency. I'm very good at what I'm doing, and you want me to deliver top-line results in 24 hours, if not less, where the old process took five working days, but now you're asking me to move from my old Porsche to a brand new Tesla, and I don't know how to drive it. So yeah, that efficiency concern is there. And the third resistance, which is my favorite: will the FDA accept it? What about the regulatory side? Is R validated? And I'm like, have you been to SAS lately? Have you asked them to open the book and show us how they validated the software? I don't think any company has been doing that in the last 20 years. So those are the three types of resistance. And I think the solution to all of them is being patient, talking to people, working with them, showing them it's doable, and not rushing it. Because if you've been using the same language for five, 10, 20 years, turning into a multi-language programmer takes time. It won't happen overnight, and not rushing is the best way to do it.

Michael, your experience with GSK on that?
Yeah, I couldn't agree with Olivier's points more. To add to that, I think we've taken the approach of trying to support programmers and statisticians from below while providing incentives from above, from a leadership-team perspective. A lot of those incentives come through writing business and development objectives towards the adoption and use of R for delivering projects and delivering work. Another resistance, which hasn't been mentioned yet but which we have also seen, is timelines, right? You want me to deliver faster, but we still have to actually do the work; I don't have time to do things in R and learn all this. And the other bit is: well, we're on a phase three study, the only thing we have to deliver this year is a phase three study, and we've got all of this historical SAS code that we can carry forward and copy forward. It's going to be much more efficient. Why do you want me to recode all of this in R, just to learn R? So what that's done is had us, on one hand, take a pragmatic approach to what it means to accomplish the objective, which really is the proliferation of R expertise and capability within the group. And secondly, to think about how we write those objectives so that people can be successful in delivering on them while also being able to deliver their projects.

Yeah, that's a great point. Amy, any experience you want to share from Merck?

Our experience, I think, is very similar to what the other panelists have mentioned: around the efficiency idea, and around the ability to reuse all of this historical SAS code and the SAS macro libraries that we spent so many years building. I would say it was not so much active resistance, but more just questioning and trying to understand what the department strategy is.
And I think there was better understanding once we more clearly articulated our strategy around the adoption of R and then showed how we were going to support our staff. It's been an evolution, but as with any change, and this is a change, it's natural to start questioning and trying to understand better why we're doing what we're doing.

Great. I have one more question, in the interest of time. We all work with CROs, contractors, and other companies, and some of us work with the same CROs. We also partner with them in our development activities. So Janssen gives one training, or maybe Pfizer gives a different kind of training, but the objective is the same: generating the reports and so on. Since the common goal is the same, is there a thought of developing some kind of common curriculum and sharing it in a way that benefits the open-source community, and even the CROs and contractors we work with? I'll start with Olivier.

That should be our target, and I think that's why we have this forum. At the same time, you have all the other layers that come in: procurement, contractual, co-employment, all those things. So I would say, you know, I'm really excited by the work we're doing with Roche and GSK on admiral and other things. If the sponsor companies can achieve it, then I think we'll be able to drive the CRO part. But I feel that until then, we need the sponsor companies to achieve it first. Because if we don't do it first, it's going to be very hard to go to our vendors and say: hey, we agreed to use admiral or another package that is out there. It's going to be super difficult. So that's part of this long list of things where we need to change the way we've been operating for the last 20 years.
So I may sound pessimistic, but I am optimistic. I think there's a precondition we need to hit: three or four pharma companies agreeing to use the same packages and producing exactly the same adverse event tables. And not "I prefer it bold, I prefer the footnote here or the footnote there."

Yeah, that's a great one, Olivier, because once we as pharma companies align, it's easier to bring others onto that path. Michael, anything you want to share on that?

I mean, if I have any reputation in this space, it's my commitment to open source. So if we can share the burden of developing and delivering training, particularly to our external partners, I think that's a win-win for everyone. It is a challenge if you want to start to require certain capabilities. We know that the intersection of clinical domain knowledge and R capability is small but growing, so finding those few individuals out there is challenging. If we can share that burden, and I agree with what Olivier is saying again, then I'd be open to it, of course.

Great. Amy, you want to chime in?

Yeah, so we are not specifically training our CROs or contractors at this moment; we are focusing on our internal staff currently, but I'm supportive of the idea. I think it's a good idea. What we have done with our functional sourcing partners is communicate our vision, to be very clear about where we're going in the future, and it's in their best interest to start thinking about upskilling their own staff; some of our preferred partners have already done that. But as far as a larger, industry-wide effort goes, I think it's a really good idea, but right now we're focusing on our internal staff.

Thanks, Amy. Kieran, you want to chime in? Yeah, sure.
So we've actually taken quite a similar approach to Amy's, in that we've communicated to our partners that we're going to be using R more and we're going to need resources that know it. We went as far as putting together, not really a syllabus, but a kind of list of skills we were looking for people to have. Kind of like the rating we have for our statistical programmers, we did it for R across various different skills, with the levels we might expect people to be at, to help them tell us what level they're at. We've had some success, but it's challenging, because they have the same problem that we have, only exaggerated: they deliver training, but then they don't necessarily have a project using R to attach a person to. So I definitely agree with the collaboration point on that. But yeah, the biggest part will be, as Olivier says, coming together, because obviously at Roche we have a bunch of in-house packages we're using right now, and I'm aware that everyone else has in-house packages too. If we can try to harmonize on that, that would be really great, because it's always going to be a challenge if you're using two different solutions.

I agree on that, because if you're using the tidyverse, yes, you can have a common strategy, but if you're using a customized package, that cannot be common to everyone. So Mike, what is your experience or your thoughts on this?

Yeah, I'm in agreement with Amy that it's not our place, ultimately, as the sponsor companies to train the CROs. The mechanisms for collaborating on and adapting training are already there. RStudio's Master the Tidyverse courses, for instance, are supplied under a Creative Commons ShareAlike license.
So there is absolutely nothing stopping an R-in-pharma organization, or someone like that, from taking that training material as a basis, making it more tailored towards pharma, and then sharing it back on GitHub, so that it's open source. And I think if we had something like that, we could point CROs there and say: this core training will suit you, it will be in the right domain, and it's not talking about penguins but rather about clinical data. We could augment it with challenges or example data, something like that, that they could practice on. And then, when those individuals come in and work with sponsor companies, you could say: well, you've got that basic threshold understanding of the nuts and bolts, now let us tell you about Janssen's package, GSK's package, whatever.

Yeah, I think: generalized training on a common platform, and then anything customized can be added on top. That also avoids any kind of co-employment and procurement issues and things like that. Great. I think we are out of time, so I don't want to proceed further; we have a few interesting questions, but I think we can park those. It's been a great panel discussion and presentation. So Paulo, do you want to wind it down?

Yes, yes. Thank you. I'd like to first thank the panelists for coming and sharing your experiences. This is great. We love having this collaboration, and we are aiming to do it in a more open-source way. That's why we're doing this: for packages, for validation, for training, for everything that we have to do as a community.
And we are really excited to be able to share our experiences on training, and we look forward to hearing from you about your experiences in the different areas of this journey of using R and open source in the pharmaceutical area. So thank you very much. Thank you to all the attendees for sticking with us, and until the next one in a couple of months. Thank you. Thank you. Bye bye.