Hi everyone. Thanks so much for being here today. We're really excited to present this project that we've worked on over the past few months. To introduce ourselves briefly: my name is Natalie Berkman, and I'm the instructional design manager at ESSEC Business School. We're one of the top-rated business schools in France, and all of our students are required to take a few entirely online courses, which will be the topic of our talk today.

I'd also love to present my brilliant and accomplished team. We have Nadia, our resident Moodle specialist; Marion, the driving force behind the pedagogical strategy we took with these courses; and Erwan, our data enthusiast and specialist.

Here's our plan. First, what is a SPOC, and what are the SPOCs specifically at our school, ESSEC? Second, what was our new design and approach? We adopted a learner-centric model and applied it to Moodle in a welcoming design. Then Nadia will talk about our assessment strategy and how we implemented it, technically speaking, in Moodle. Finally, Erwan will talk about data tracking and how we've tried to ensure student success in these entirely asynchronous online courses through Moodle analytics.

So without further ado: I'm sure many of you have heard of MOOCs, and probably of SPOCs as well, but just so you know, both are acronyms. A MOOC is a massive open online course; a SPOC is a small private online course. The main distinction is that a SPOC is an online course, much like a MOOC, that's reserved for a specific community. SPOCs are cohort-based, and they combine digital elements like video and interactive peer-to-peer exercises with discussions and lectures, which can be live, recorded, or a mix of both. At ESSEC, we have five SPOCs, and we picked their topics based on key points in the school's general strategy.
These are topics that we believe all students pursuing a business degree today should have exposure to. The five SPOCs are: AI, very important nowadays; responsible leadership; the fundamentals of entrepreneurship; diversity and inclusion in the workplace; and climate change. We give these to every single student in our three main pre-experience programs: the Global BBA; the Grande École, our master in management and flagship program; and our specialized master's programs. That's a grand total of 6,000 students, so obviously it's a bit of a challenge. And that's about it from me, so I'll pass it over to Marion, who will explain what our learner-centric model is and how we found a design to fit it.

Yes, so we created a learner-centric model based on three aspects. First, we defined a pedagogical framework using, of course, Bloom's taxonomy. We also decided to use Fink's significant learning model, because we want to focus on transmission and impact on the students, and a skills-based approach, because we want these courses to be useful to the students for their careers and their daily lives.

From this pedagogical framework, we defined an assessment strategy. Nadia will speak about it later, but very briefly, we decided to focus more on formative assessment and self-assessment than on summative assessment. Summative assessment is necessary for the institution, but we chose to emphasize the first two.

And finally, the Moodle design, because at the end of the day, it's all about design. We created a visual identity for each SPOC but a harmonized structure: all five SPOCs adopt the same structure, so students are never lost and know where to find each piece of information. We also created an explicit onboarding that makes the workload explicit, which I will show you just after this.
And finally, we decided to create more interactive content, using mostly H5P interactive books. Here is an example: our AI for Business SPOC, which we are particularly proud of. On the left you can see the current visual identity; we chose images coherent with the theme of each SPOC. On the first tile, called Onboarding, you can find the presentation that you see on the right. It's made with Genially, and it's laid out like a book, so students can navigate it in a linear way, but they can also click on whatever interests them. They can click on the course structure or on peer evaluation and go back; everything is designed to leave students free, as actors of their own learning.

Then you can see an example of what's inside a tile. If you click on the Phase One tile, for example, you see the content. It was made with Bootstrap cards, and if you click on one, you access the H5P interactive books. Our content is mostly made of videos and readings, which is a fairly passive kind of activity, and we didn't have time to remake it all on short notice, so we decided to focus on design and created H5P interactive books to make it more user-friendly for students.

In the original SPOCs, our assessment strategy was based exclusively on summative activities with automatic correction: basically multiple-choice quizzes and workshops. Since we're very aware that our teachers can't grade 6,000 essays, we adapted the strategy around the concepts of mastery learning, significant learning, and competency-based learning. Our assessment strategy now makes use of both formative and summative assessment. The formative side was broken into two components: formative multiple-choice questions and self-assessment questions.
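The spirit of that formative component, immediate feedback that guides the learner rather than a recorded grade, can be sketched in a few lines of Python. This is only an illustration of the idea, not our Moodle/H5P implementation; the questions, answers, and feedback strings are invented.

```python
# Sketch of a formative multiple-choice check: the learner gets immediate
# explanatory feedback per question, and no grade is stored anywhere.
# All question content below is invented for illustration.

QUESTIONS = [
    {
        "prompt": "A SPOC is best described as...",
        "options": ["a massive open course", "a small private online course"],
        "answer": 1,
        "feedback": "SPOC = Small Private Online Course, reserved for a cohort.",
    },
    {
        "prompt": "Formative assessment is mainly for...",
        "options": ["certifying a final grade", "guiding the learner's progress"],
        "answer": 1,
        "feedback": "Formative checks inform learning; summative ones certify it.",
    },
]

def check_answers(responses):
    """Return per-question feedback instead of a stored score."""
    results = []
    for question, chosen in zip(QUESTIONS, responses):
        correct = chosen == question["answer"]
        results.append({
            "prompt": question["prompt"],
            "correct": correct,
            # In the spirit of mastery learning, the explanation is
            # shown whether the answer was right or wrong.
            "feedback": ("Correct! " if correct else "Not quite. ")
                        + question["feedback"],
        })
    return results
```

For example, `check_answers([1, 0])` marks the first response correct and the second incorrect, returning the explanation in both cases so the learner can self-correct.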
We even invented a little character called EvaluBot to help students go through assessment activities: he indicates whether an activity is formative or summative and how to approach it. We used H5P for the multiple-choice questions for two reasons. The first is that, with the advent of ChatGPT, there's a browser extension called ChatGPT for Moodle that lets students get all the answers on quizzes native to Moodle, and that extension doesn't work with H5P. The second is that we wanted to personalize the quizzes to fit the visual identities of the SPOCs.

We also used self-assessment to help learners evaluate their own skills throughout these courses, using an external tool called FeedbackFruits. The formative assessment and the self-assessment are part of an engagement grade. For the summative assessment, which is peer assessment in our case, we also used FeedbackFruits, because it has a user-friendly interface and more powerful analytical features than Workshop, the native tool in Moodle. This graph represents our proposed engagement-grade breakdown for the teachers, and every professor can customize it however they want. And now Erwan will talk to you about data.

Thanks, Nadia. Finally, to better ensure the success of students in the SPOCs, we have chosen to use data tracking in three different ways, as you can see. It's important to keep in mind that for most of our students, these SPOCs are their first experience of a fully asynchronous online course; it's also their first contact with Moodle at ESSEC. So first, completion tracking and re-engagement: we put completion tracking in place to allow students to visualize their progress in the course.
Additionally, we have set up automated reminders using the Re-engagement activity, which also reduces the workload of our pedagogical assistants.

Second, we use Moodle analytics to identify students at risk. I hope you had the chance to see Stefan's presentation last Tuesday, because I won't delve into the details. But thanks to our collaboration with Stefan, with our DevOps team, Rojji and Melvin, and also with the creator of Moodle analytics, we have conducted several beta tests. Practically speaking, we sent an email to a randomly selected group of students at risk, as identified by the algorithm, and these students succeeded at a greater rate than those in the control group who did not receive the email: a delta of 11 percent between the treated group and the control group. So this year, we will send multiple emails on a weekly basis to students, depending on their engagement with the course.

And the third point: analytics to identify outliers. This year, as Nadia said, we replaced all the Workshop activities with the FeedbackFruits peer evaluation tool. Why did we make this change? Because FeedbackFruits has developed an analytics dashboard that can be used to identify outliers, even with a large cohort of students in a course. This should allow us to understand which students took the assignment seriously, for both the essay and the feedback to their peers, a task that was quite difficult in the past.

So that was our journey of rethinking assessment and student engagement in our five SPOCs for this new academic year. If you would like any further details, feel free to contact us at kelab.essec.edu. Thanks, everyone. And if you have any questions, I think we have time; we're glad to answer them. Thank you, everyone.

Hi, thanks very much for that.
I wondered if you'd made any use of H5P's essay question activity, where the activity gives feedback on an essay answer by identifying keywords that the student must type in. I wondered if that's something you might want to look at for, again, a more autonomous approach to essay feedback. I appreciate it's not the same as a teacher grading an essay, but if you're looking for facts and keywords, it's a really great way of highlighting them and providing a model answer as well.

Thank you so much for that question. That sounds like a really interesting tool. We haven't considered it, because FeedbackFruits actually also includes an automatic feedback tool. We've just started these courses now, so we haven't run the first peer assessment activity yet, but we've been toying with the idea of using the FeedbackFruits automatic feedback tool, where, in a similar vein, you can say you want students to use certain keywords, and you can put in information about a minimum or maximum number of words. That way, you also get much more information on the analytics dashboard. But H5P, I mean, we love it; we use it for all the designs. So yeah, maybe we should take a look at that too.
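The keyword-based approach discussed in this exchange, whether through H5P's essay content type or an automatic feedback tool, boils down to scanning a response for expected terms and word-count bounds. A rough sketch of the idea, with invented keywords, limits, and messages (it only matches single-word keywords, unlike the real tools):

```python
# Rough sketch of keyword-based automatic feedback on a short written
# answer: scan the response for expected terms and report which are
# missing, plus simple word-count checks. Keywords, limits, and
# messages are invented for illustration.
import re

def keyword_feedback(text, keywords, min_words=50, max_words=400):
    """Return found/missing keywords and advisory notes for a response."""
    words = re.findall(r"[\w'-]+", text.lower())
    found = [k for k in keywords if k.lower() in words]
    missing = [k for k in keywords if k.lower() not in words]
    notes = []
    if len(words) < min_words:
        notes.append(f"Too short: {len(words)} words (minimum {min_words}).")
    if len(words) > max_words:
        notes.append(f"Too long: {len(words)} words (maximum {max_words}).")
    if missing:
        notes.append("Consider addressing: " + ", ".join(missing) + ".")
    return {"found": found, "missing": missing, "notes": notes}
```

As the questioner notes, this is no substitute for a teacher grading an essay, but it scales to thousands of students and flags the factual points a model answer should cover.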