Let me share my screen. Welcome everyone. Shashi, should I go ahead and get started? Yes, please go ahead. Great, thank you. Hi everyone, good morning, good afternoon, good evening, depending on where you are. Welcome to the virtual team summit for the pilot phase teams in the digital learning challenge. My name is Devin Crotman. I think a few of you have met me either virtually or during the last team meeting that we did. My role is sort of the program lead director; I oversee the day-to-day operations. So I'll kick it off and go through a few slides before I hand it over to my colleague Shashi and then Monique, who will handle the brunt of the Q&A today. So first of all, Shashi, let's go back one slide. Yes, I'm sorry. OK, there we go. Yeah, so a big congratulations to each one of you for making it to this phase. I've been around XPRIZE for almost seven years now, and if you make it past a judging round, you should give yourself a pat on the back. XPRIZEs are uniquely difficult and challenging, and no matter how far you end up in the competition, making it this far is a credit to you. So you should bask in that glory and take some pride in that accomplishment. Before we get started on the agenda, I just want to give a big shout out to our sponsor for this challenge, which is the Department of Education's Institute of Education Sciences. Their partnership, their sponsorship, and their leadership are really why we're here today and the genesis for why we're trying to modernize how we measure educational outcomes. If not for them, we wouldn't be here today, and we want to thank them for incentivizing this competition and moving us forward. So today, we're going to go over the challenge timeline, more specifically from now until the end of Q2, and really focus on the pilot phase.
Shashi will talk about the really fun insurance requirement, then she'll go over some generalized feedback on the technical submission, and then Monique will deep dive into the pilot study phase itself, and we'll do some Q&A at the end. So we'll go to the next slide and then the next one. OK, so this is a revamped timeline, really homing in on the period from now until the end of the challenge, which is March 2023. What we want to do today is focus on the first four rows, if you will. So today is the virtual team summit. We're going over the initiation into the pilot phase. We'll talk about the insurance submission deadline at the end of this month, then when the pilot study submission will open in March, when it will close in July, and how this phase ends as well. So we're really just focusing on the next few months. The pilot phase will take us from now until the end of Q2, and then we'll go into the next phase, the demonstration phase, which will have our five finalist teams. So we'll go to the next slide. OK, so some housekeeping items on our end. Again, we want to congratulate you for making it to the pilot phase. We are not going to be doing a press release for this moment; we're going to save that for later in the year, when we have our finalists. But we would like to write a blog piece highlighting the 10 of you and post it to our website sometime in late February or early March. So at your earliest convenience (we know you have a lot on your plate in terms of the insurance requirement and getting ready for the pilot phase), please email over your team's URL if applicable, your logo if applicable, your updated city and/or location, and a short paragraph describing your team's background and its applicability to the challenge.
And I know you have provided this multiple times, but if there's a revamped version or something more streamlined that you want to provide, that'd be very helpful. We'll use it to post to our website and also include it in the blog, showcasing your work and why you're one of the pilot phase teams. So I'll pause there and transition it over to my colleague Shashi, who will talk about the team insurance requirements. Thank you, Devin. Okay, hello, everyone. I will be speaking to you about the insurance requirement and the feedback received from the judges on your technical submission. In regard to the insurance requirement, there are three mandatory insurances that you need to procure. The first one is liability insurance, which will protect your team against claims for bodily injury, death, or damage to property that may arise out of your team's participation in the challenge, with minimum coverage of $250,000 per occurrence and $500,000 in aggregate. Teams also need to make sure that they procure endorsements from their insurer to all policies obtained, naming the XPRIZE Foundation and the Institute of Education Sciences as additional insureds with waivers of subrogation. Workers' compensation insurance is the next one, which you need to take care of if you have employees in your team or organization. Teams need to make sure that the insurance is as per the applicable laws of the nation, state, territory, or province having jurisdiction over the team's employees, with limits sufficient to cover the team's potential liability to its employees in connection with the team's participation in this challenge. As I mentioned earlier, if a team has no employees, then workers' compensation insurance is not required.
And if any team has volunteers working with them, then they need to procure volunteers' health and accident insurance sufficient to cover any injuries that may occur during the timeline of this challenge. The last mandatory insurance is automobile insurance. If your team owns, leases, or operates an automobile in connection with participation in this challenge, then the team needs to procure an automobile insurance policy with limits sufficient to cover the team's potential liability for bodily injury and property damage to third parties. These are the three mandatory insurances. Apart from those, we would recommend procuring umbrella liability insurance coverage with limits of no less than about $2 million, but it is not a mandatory requirement. For full information on the insurance requirement, please check out Exhibit C of the competitors' agreement. Now, all the insurance policies that you procure need to be from an insurer rated by AM Best at not less than A-VII. In the case of workers' compensation insurance, the insurance needs to come from state- or government-approved programs. If the insurer is not rated by AM Best, then evidence supporting the insurer's financial strength needs to be provided, which would be subject to approval by XPRIZE. All insurance policies should be valid for the entire challenge timeline, and all insurance policies are due by February 28, 2022, 11 AM Pacific time. You need to submit your insurance certificate along with the compliance certificate form to digitallearning@xprize.org. We will be sharing with you the generalized feedback received from the judges on your technical submission, to foster learning and to be clear about our expectations. The feedback covers both the teams that were eliminated and the teams that are moving forward in the competition.
Most team submissions lacked information on the randomized controlled trial and the replication plan for the experiment in both the pilot study and the demonstration phases of the competition. Information on how the study outcomes align with your principles was missing. Many teams did not explain the usability study and did not provide information on pre-registration of experiments and scaling of their solutions. We would like to inform you that IRB approval of your study is mandatory, and your solution needs to be a credible research platform. Teams need to add consent to the platform for the students whose data is being used for the study. Please make a note of all these requirements and make sure you incorporate all these details in your next submission. Thank you. I would like to ask Monique to talk to you about the pilot study phase requirements. Hey, really quick before we hand it over to Monique: thank you, Shashi. A few things on the insurance requirements. We know they're very rooted in legalese, so if you have any questions about that, please let us know; we can route them to our legal team. We are not lawyers. The insurance requirements are applicable across all XPRIZE competitions, no matter what they are. So we appreciate everyone's patience, and we know a few teams have sent that in already, so thank you for that. In terms of the feedback, it really varies competition to competition whether XPRIZE provides feedback after each phase. But talking to the judges, we felt that providing generalized feedback would help hone the solutions for the 10 teams going forward. So again, to Shashi's point, not all of this feedback will be applicable to you; it was applicable across all the teams that submitted a technical submission. It's just something to weigh, if it does apply to you, as you go forward. So thank you. Monique, it's all yours. Good morning, good afternoon, good evening, everyone.
First of all, I'd like to join my colleagues in congratulating you all for making it this far. I'm really excited. I'm a math nerd, so I'm really anxious to get to know you all and the work that you're going to be doing. So I want to take a moment to reiterate and break down what the pilot phase will look like. There were a lot of questions, and great questions, may I add, but I want to make sure that we are all on the same page about what the expectations are in this phase. Overall, you need to conduct a one-month minimum study in an education setting. There are details in the guidelines and rules and regulations about how we define that education setting, so please make sure that you are referring to those documents, as they are your best guide for how to engage in our competition. So you'll conduct your one-month-long study, your pilot study, and you will also conduct a replication with at least one learner demographic. Take your initial study; let's say it lasts for 30 days for ease of this example. If I start my study on March 1st and I conclude on March 30th, I will have 30 days from March 30th to launch my replication study with a specific learner demographic. In that pilot study, I might sample all the kids, or I may not have a sampling criterion for the initial pilot. But when I do my replication study, I might want to focus on ELL learners, or on students who are remote, and think about what it may take to conduct that work within the February 2022 to July 2022 timeframe. So you have a lot of time, but not a lot of time, to really make sure that you have all your ducks in a row regarding IRB. You might need to execute MOUs with your institutions. We give enough time to make sure that you have all your ducks in a row to carry out these two studies. So when you complete your studies and it's time to submit your information in July, you will submit a technical report of your pilot.
The judges and I are working on the guidelines for constructing that report. I wouldn't think of it as an in-depth technical report, but some explanation and justification, and maybe an executive summary of what you've completed. You'll also submit the raw data generated by the study, reports of the data, and a set of analyses using the raw data. Again, more details on what this looks like will be coming out soon, because the judges and I are going to make sure that we have clear expectations and outlines of what we mean by this. And of course, you can always refer to the rules and regulations linked here, but we will also take advantage of the Slack channel to put all of your resources there. So if you're not on Slack yet, please reach out to Shashi or digitallearning@xprize.org so we can get you linked into the Slack channel. Next slide, please. Thank you. So, more information about the pilot phase study. Again, we want to make sure that it is conducted in a U.S. setting. It should reflect, to the best extent possible, the diversity of the students and characteristics of the American education system. If you are studying students in the public education system, be mindful of the representation of those students, and perhaps whether they attend Title I schools, what that means, and how those students navigate those systems. If you're interested in doing charter schools or private schools, understanding the policies and how those policies may impact how a student navigates or learns in those environments is also very critical to the work that you'll be doing. Next, you want to demonstrate the ability of the platform to collect and analyze data. This is extremely crucial because this is what IES is looking for. They're looking for organizations who are developing tools that have the capacity to not only collect and analyze data but also tell us about the outcomes that were achieved for those students and what factors led to those outcomes.
The study should be pre-registered using an open science platform. I'm going to tap into my network to find a professional to walk you through this process: how do I register my study on an open science platform, and what information do we need to provide? This is a requirement of the study, so I want to make sure that you all know how to do it. Primary outcome measures should include student outcomes sensitive to performance change. On the next slide, we'll talk about exactly what those required data points will be, but I just want to point out that there are specified required data points that you have to submit in July. And then finally, your study should be consistent with the Standards for Excellence in Education Research, or SEER, principles. I'm going to also tap an expert to come in and talk to you about the SEER principles so that when you are writing up your reports or conducting your study, you can speak to these principles and demonstrate that you can meet the expectations of this aspect of the competition. I see we have a comment from Caleb: "I'm an ambassador for the Center for Open Science and will be happy to serve as a resource." Awesome, so we can use the resources within our own community to do that; phenomenal. So Caleb, I will be in touch with you. Let me take down your name. I'll be in touch so we can chat offline about what that looks like and how we can get you connected as a resource. Appreciate it. Next slide. And I'll get to your Q&A questions at the end, but please drop them in the Q&A function. So here are the pilot phase minimum data requirements. The study must collect the following data points. Individual student identification: obviously, you should make sure that this is confidential or anonymized. You don't want to share any student names with us or any student personal identity information.
You should apply a case ID or some other unique identifier to the students. Next, any demographic data and information on student characteristics that impact the education outcomes. If you're collecting information on student race, you have to have a justifiable reason as to why you're collecting that information and why that specific characteristic is important to the educational outcomes. Not just, hey, we're studying differences between Black and white students, but what is the scientific justification and reasoning for that? Next, the student baseline measure. That could be standardized test scores, math or reading scores, or performance on a specific quiz. Whatever you are trying to measure or track, make sure that you have your baseline and your post-study outcomes. The baseline and outcome measures must be measured using the same units; consistency, making sure that we are operating on the same scale between your baseline and outcome measures, is important. Student attrition: this happens. We are living in unprecedented times, but we also know that people are constantly moving in, through, and out of systems. So can you go back, please? There you go. So just make sure that you are tracking that. If you enroll 400 students in your study and 50% drop, or for some reason the school closes or goes remote, make sure that you track that and how it impacts engagement in your study. And then process data describing how students interact with educational materials and activities. This may be the data that you're getting through your platforms; ideally, this would be the information that you're getting through your platform. So if they are demonstrating proficiency across a certain number of learning paths or activities, you want to track that. If they are maybe not so consistent in their performance, you want to track that too.
It may be other behaviors, such as how long they are spending on certain activities. So whatever your platform is designed to do, this is where it shines, in showing us how you're collecting data, how you're analyzing it, and how you're interpreting it. And we will work with our IT team to make sure that there is a portal for you to submit all of your pilot phase performance information to our POP website. Next slide, please. Great. As our colleague Shashi mentioned before, IRB approval is required. We're working with vulnerable populations: students, minors, and possibly other special populations. So we want to make sure that we have an understanding that IRB review is required. For certain teams, you may think that a full review is not necessary; understood, you might do an expedited or exempt review. We just want to make sure that there's an independent, accredited IRB agency or academic institution who is reviewing your research proposals, who is asking the important questions that need to be asked, requiring the documentation that is required, and that you are getting documentation to protect the work that you're doing and the students who are engaged in your studies. So we would ask that you submit that documentation by June 1st, and you'll send that in to our digital learning website. If you have any questions, or if you're not sure about how to go about it, or you need recommendations on independent IRB agencies, we'll be happy to field those questions. This is a great opportunity for you to utilize Slack and share the agencies that you might be using. This is a community of learning, but also a competition. The better the foot you put forward, the better everybody else will be. So it'll be really great if we can take that perspective for the next year in this competition. Next slide. I think this is it. Okay, great. We're in the Q&A section of the presentation.
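As a purely illustrative sketch of the minimum data points Monique walks through above (the field names and values here are assumptions for illustration, not an official submission schema), one record per anonymized student might look like:

```python
# Hypothetical illustration only: one row per enrolled student, anonymized.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class PilotRecord:
    case_id: str                          # unique anonymized ID, never a student name
    demographics: dict                    # only characteristics you can scientifically justify
    baseline_score: float                 # pre-study measure (e.g., a math quiz score)
    outcome_score: Optional[float] = None # post-study measure in the SAME units; None if attrited
    attrited: bool = False                # track drop-outs, school closures, moves
    process_events: list = field(default_factory=list)  # platform interaction logs

# Example record with made-up values.
record = PilotRecord(
    case_id="S-0001",
    demographics={"grade": 5, "ell": True},
    baseline_score=62.0,
    outcome_score=71.5,
)
record.process_events.append({"activity": "fractions-path-3", "minutes": 14})
```

The point of the sketch is simply that baseline and outcome live in the same units on the same record, and attrition and process data are tracked per student rather than reconstructed afterward.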
So I'm going to jump right in and read Kristen's question: can the replication study overlap with the pilot study or occur simultaneously? I would say it depends, right? How do you define occur simultaneously? I would say you need to wrap up the pilot before you start the replication. Do you need to take the full 30-day window? No, you can start the replication immediately. If you want to run multiple replications, those can run concurrently with one another. But I think the goal of this competition was to demonstrate that we have reproducible studies that can occur within 60 days, so we really want to test that assumption. Keeping that 30-day window is important to making sure that our assumptions can be analyzed across each team, if that makes sense. And Devin, feel free to jump in if there's anything I missed. But that's essentially the goal of this competition: can we create reproducible, rapid studies in a 60-day window? Anonymous says, one small challenge we would... Yeah, no problem. Thank you, Kristen. One small challenge we were running into in the technical submission phase is that schools close in May and June where we're working, and the last month is reserved for exams, so we only have February to April. I'm curious how others are handling this or running into this challenge. Great point. So this is information that we can take back to our sponsor and the judges to determine how we might navigate some of these timeline issues. I'll take that information, share it with my colleagues, and we will get back to you. If you can send us an email (I know you're anonymous, and that's fine) with your plans and some of the issues that you think you may run into, please do so. Yeah, and to build off Monique's point, I think it's really interesting to see how other teams might be approaching this, so I encourage you to use the Slack channel to share these ideas as much as you feel comfortable.
Again, it is a competition, but sharing these kinds of collaboration ideas, as much as you all feel comfortable, would be helpful. Okay. So I think we should go back to the judges and our sponsor, because if the concurrency thing doesn't work out, what are the options in terms of making sure that we have enough time to get these studies in? So let us go back to our colleagues and make sure that we have an answer for you. Thank you, and please send us an email. If you are impacted by that, please send us an email so we can go and say, we have a number of teams who have questions about this; how can we support them? Okay, great. Thank you, Lauren. So, back to the concurrency question. I think Monique, Shashi, and I can talk a little bit offline with the sponsor. So if not, you know, would we need to build in an extra month or two in order to get to that demonstration phase? Possibly, okay. We can take a look at the timeline and have that conversation. I'm just wondering, yeah. I think it's more so about the schools and their timelines, right? More than ours. So if schools are ending in May and early June, then, I mean, we picked these teams in the areas that they want to work in. So we should probably just have a conversation about, you know, can the studies be concurrent? Right. And how much time should pass before the replication starts? Can you start your pilot, and then two weeks later, while you're still running a month on the pilot, initiate your replication? I'm not sure, but we'll see. Yeah, because, to Ben's point, I don't even know if this is an extension question, because if we extend, I mean, schools don't pick up again until September, right? And that's a little bit too long within this timeline. So, yeah, we will check in. I think you should consider April. Okay. Okay. Thank you all for your feedback.
We will take that to our sponsor and colleagues, and we should have an answer for you pretty soon because, as you know, we are initiating this phase right now. Are there any other questions? I've got the questions on the timeline. Oh, I did want to mention: Caleb, can you please reach out and send an email to digital learning to talk about the SEER principles and Open Science. Yes, office hours; sorry, I said guest speakers. Office hours. I will be sending my monthly availability through the Slack channel, just to say, hey everyone, I'm available at these times for drop-ins if you want to chat. Just send me a message on Slack and I'll send out the Zoom link to provide one-on-one or group check-ins and support, so that we're making sure that we are connected. And if any bumps or hiccups arise along the way, that is your opportunity to just drop in and give an update. The best form of communication moving forward is Slack for anything for the good of the group: things that will make all of us better, things that would be good to share ideas on, like IRB agencies, or if you're looking for particular software, whatever; feel free to ask the group about that. Email is for meeting scheduling and anything concerning your group and your experience in the competition. Shashi is your community lead, so she will be sending out information. So make sure, as a team lead, you're sharing that information with your entire group so that they are informed of the updates and changes to the competition. And with that, I conclude. Yeah, so we're right about at the half-hour mark. Oh, I'm sorry, there was one more question, from Christina. For the demonstration phase, the only difference is that you'd have to replicate; I believe it's five replications across three different learner demographics.
So use the pilot phase structure, the minimum 30-day study window and the maximum 30-day window to launch your replications, and you'll have to do it five times for at least three groups. And that will conclude in March of 2023. So the minimum applies to the pilot length, and the maximum applies to the duration between the pilot's conclusion and the launch of the replication: the maximum is 30 days to launch the replication, and the minimum pilot length is 30 days. Okay, Ben asks: can you share other examples of demographic and student characteristic variables that might be included? I agree that race is potentially problematic; are there other examples that might be better? It's not that race is problematic, but it has to be tied to a theory. It has to be tied to some type of need related to the interventions that you're carrying out. But I would say other characteristics could include reading level. It could include whether a student is participating in a choice program as opposed to attending their neighborhood school; it could also be students who are considered gifted, or students who are athletes. So there are tons of ways that you can slice and dice, or think about how you want to characterize your samples to conduct your RCTs or QEDs. But whatever you decide, we want to make sure that it is rooted in some basis, that you're trying to address an outcome that students who hold these identities or characteristics will benefit from in some way. And again, we can use our weekly office hours, or we can just chat offline about this. I'm happy to review anything that you put forth before the IRB, just so that we can keep transparency amongst all of us and we can support you. Well, I think there's a question in the chat that just popped up. Yes, but what we mean by a minimum of 30 days is from the time you collect your baseline to the time you collect your outcomes.
So there might be some activities going on after you collect your baseline and outcomes data, but that's the window of time that we're looking at: baseline to outcomes should be 30 days minimum. Yes, between pre- and post-tests, if that's how you're framing it; yep, your baseline and your outcomes. And it could be 60 days too; again, it's a minimum of 30 days. So I just want to make clear that your study can run as long as you need it to, but make sure that you have enough time to replicate and conduct all the other activities to turn in your work on time. Okay, one more question. Our project, okay, I think it's two more questions. Oh yes, statistical power, yes. So, power analysis: there was a question about whether we need to conduct this for both the experiment and the replication, or combined. The answer is separately. You want to treat each of these studies as unique and independent, even though, obviously, you are replicating a pilot. So the short answer is yes, conduct your sample size and power analyses independently of each other: one analysis for the pilot, one analysis for the replication. And one more: regarding pre-registration, our project is about crowdsourcing teachers as researchers to develop their own projects and then replicating with other teachers; what does pre-registration mean in this case? So, I believe pre-registration is about transparency. It's about sharing your ideas in, what's it called, a digital space where other researchers are trying to either replicate or see the trends in the research around certain topics. That's what it means to register your study: so that people can read it, people can critique it, people can ask questions, or anything like that. I hope that answered the question, but it really is more so IES wanting transparency, and wanting other researchers to know what's being conducted so that if they want to replicate, there's a resource to do so. All right, so how did I do?
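The advice above, running the power analysis separately for the pilot and for the replication, can be sketched with a standard normal-approximation formula for a two-group comparison of means. This is a minimal, hypothetical sketch only; the effect sizes below are made-up placeholders, not guidance from the competition:

```python
# Normal-approximation sample size for a two-sided, two-sample mean comparison.
# Run independently for each study, as the answer above advises.
from statistics import NormalDist

def n_per_group(effect_size: float, alpha: float = 0.05, power: float = 0.80) -> float:
    """Approximate students needed per group to detect a given Cohen's d."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value for two-sided alpha
    z_beta = z.inv_cdf(power)           # quantile for the desired power
    return 2 * ((z_alpha + z_beta) / effect_size) ** 2

# Placeholder effect sizes; substitute what your own study expects to detect.
for study, d in [("pilot", 0.5), ("replication", 0.4)]:
    print(f"{study}: ~{round(n_per_group(d))} students per group")
# pilot: ~63 students per group
# replication: ~98 students per group
```

Treating the two analyses independently means the replication's sample size is driven by its own expected effect size and demographic, not inherited from the pilot.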
How did we do? Okay, so it says there's another question coming in. All right, no problem, Ben. I'm unable to see the questions being sent in the chat. Okay, yeah. Yeah, so we'll send out the entire Zoom recording of this conversation so everyone can re-watch it and revisit all the questions that Monique answered. So hopefully that helps with everything. Phenomenal. Thank you all so much. You've asked amazing questions. You've done amazing work thus far. We cannot wait to see the amazing work that you will do. Feel free to reach out to us via Slack or email if you need anything. We will be sending the recording, and there'll be a little visual; I'll be sharing that with you, about how to visualize and make sense of this funky timeline that we're working on. But we will get back to you on whether or not the sponsor is willing to consider making changes to the timeline to support you, in recognition of schools' timelines. And can you share the list of the 10 teams with us? How do you feel about that? I think we can do that, so we can send that out later today or tomorrow. We just ask that you don't publicly make an announcement of this yet; we want to try to get this blog up and running, but we're certainly happy to share the names of your peers who are moving forward. So your to-do list, teams, as Devin mentioned: send over that information to us so that we can write that really nice blog piece about you and the work that you're doing; Shashi is asking you to submit your insurance documentation; and I am asking you to take a look at the rules and regulations, take a look at your own proposals, and if there are any questions that you have, utilize that Slack channel. Great. Thank you all for joining us. Thank you. Thank you, everyone. Thank you, Monique. Thank you. Thank you, team. Thanks. Thanks.