So, hello everyone. I'm Martin from Nido City University, a very small university in the north of Japan. I'm also representing the Moodle Association of Japan as an executive member in charge of what we call our showcase, which is what I'll be talking about today.

And my name is Sila Boss. I work as an instructional designer at Tallinn University of Technology.

Okay, so let's go. A brief outline of our presentation: we'll start with some general considerations in assessing courseware, including a framework called the Assessment Purpose Triangle that we'll take a look at. Then Sila will present the Estonian e-course quality label system. I'll talk about our Moodle Association of Japan open courseware showcase, then we'll have a short discussion comparing the two models, and we'll end with questions and comments from the audience.

I'd like to start by inviting all of you to think about what makes a good Moodle course, or what qualities would make a Moodle course great for you. If anything comes to mind, just shout it out; we won't have much time. Accessibility, I've heard that one; that's a level up. My colleague Kodri already mentioned that she gave a little introduction, so I will continue now. Any others? I guess you've already done this, then.

I thought we could take a look at this model by Archer, who proposes an Assessment Purpose Triangle. She talks about priorities for evaluating courseware and about finding a balance between the three sides of the triangle. There is assessment for learning, which I'm calling pedagogical considerations: things about the course content and the learning material. There is assessment for accountability, which covers ethical or communal considerations. And there is assessment for certification, progress, and transfer, which would be more institutional concerns.
I found this model and thought it a very good way to balance priorities that depend on your own situation, because there are so many reasons you might wish to assess. We use rubrics. The nice thing about rubrics is that they let us describe multiple assessment purposes or criteria and quantify them by assigning numerical values. Rubrics can have an instructional function for content creators, provide transparency, and can be both holistic and analytic. Another thing to consider is who does the assessing. It's important to give all the stakeholders with vested interests an equitable, but not necessarily equal, voice; who you would engage to do the assessing of courseware is something to consider. We're both very nervous, so pardon the confusion. Thank you for the encouragement. As far as we know, there are just two models, two systems, for assessing Moodle courseware: we have this one in Japan, and Sila will now talk to us about the Estonian course quality label.

That's how the different logos look. I want to speak here, then I can see you. Can you hear me now? Yeah, good. I will briefly give you an overview of how we assess the courses. The Estonian e-course quality label has an aim, and the aim is to increase the quality of e-courses in higher education, vocational training, and general education. That means everyone from every level can submit their course, and we will evaluate them equally. Even if you have a 100% e-course, maybe in doctoral studies or something like that, and a half e-course in a high school, we evaluate them with the same metrics. That's how the logos look. The course authors can use them to promote their course, and we also add the year, because the quality label doesn't mean your course is excellent forever. You have to keep it updated, and that motivates course authors to renew the label. Our matrix is based on the ADDIE model; I'm going to show it to you.
Our course authors have to complete an online self-evaluation with 29 questions, all based on the ADDIE model. Was that understandable? I can briefly explain for those who didn't understand. First, the Analysis part asks the author to self-evaluate the course; there are questions about course outcomes and the syllabus. The Design part has questions about planning the learning and teaching process. The Development part has questions about the e-course structure, creative governance, and audio-visual media. In the Implementation part we check how the course works in real life: how the students interact with each other, how the course actually runs. And the Evaluation part is about what the teacher or course author does after the course and semester have ended: how they use the feedback, how they grow their course.

Here you can see that the number of submitted courses has risen every year, and this year we had the highest number, which also meant we had the highest number of judges. I forgot to explain what's on the chart: blue represents the submitted courses, and yellow shows how many courses got the label, so not every submitted course gets the label. Here, the green part shows how many judges we had per year. This year we actually had more than 74 judges, because we had so many new courses. Yellow shows how many new judges joined, because every year we have more and more courses, which means we have to invite more and more judges to take part.

The judging process is actually quite long. It starts with a seminar for the new judges, because they don't yet know how to do the work, so we explain and show them. The grouping is also decided then: in every group there is one experienced judge, who has done the job for almost ten years, and one new judge, who has never done it, so the new ones always learn from the expert.
We always sign confidentiality agreements, because we get access to the real Moodle sites; otherwise you would not know how the plugins work. Our big universities develop their own Moodles, so our Moodles also look different, and seeing the real site gives a better overview of how the course actually works. You will also see real-life activity, hopefully: how many teachers there are, the forum posts, and everything. After that, we have a group feedback assessment, where we decide as a group how many points to give the course. At every step we use the same matrix that I showed you, with the 29 questions; it's not that each group has a different matrix, it's always the same, and we always ask the same questions. After that there is the final seminar, where the ten best courses are presented and the annual winner is voted on. And after that, we send anonymous feedback to the course authors. And now back to Martin.

Briefly, let's take a quick look at what we have in Japan. We call it the showcase. At the Moodle Association of Japan we have a Moodle with some customized plugins, and people share courses; this is what the site looks like. We share only courses: teachers and content creators drop their MBZ file into the window and submit it for assessment. Now, going back to the assessment purpose triangle, and assessment for learning: these are our goals in the assessment of the showcase. We strive for excellence of learning content, of course, so we're looking for well-designed and varied activities that incorporate multimedia and creative plugins, lots of interactivity, timely feedback, and an appropriate amount, or extensiveness, of content. Assessment for accountability: we are the Moodle Association of Japan, so we try to serve the Moodle community in Japan, the Japanese community of Moodle users. In that sense, we value courseware in Japanese.
And this is a situation very common in Japan: we have a lot of active language teachers, so a lot of the courses are language courses in English, and we're trying to get more and more Japanese Moodlers to contribute courseware. We seek to promote a wide range of content and topics, and we value courseware that provides fair and transparent assessment. We also try to encourage teachers to create original content and share it, so we're looking for open educational resources. I think that's probably our strongest priority, so our balance in this triangle leans more towards accountability. In terms of the last one, assessment for certification, progress, and transfer, we want to provide an opportunity for advancement: we want to help teachers build their own portfolios for their resumes and their careers. We also attempt to position the Moodle Association of Japan as a body of authority with a certain amount of Moodle expertise, and, even though these goals are perhaps a little bit illusory, to create a Japan-wide standard of quality in e-courses.

This is just a quick look at the assessment rubric. We have five categories of assessment, such as variety of activities and content, and for each we have a scale, a range of points we can give for different levels of accomplishment. We also allow for some holistic judgment. This is the rubric we've been using since 2021. At the bottom here, under the plus, judges put in other considerations or comments about the course and its context of use, and that can change the total score. As for the judging process, we have an annual Moodle Moot. We usually get somewhere around ten courses submitted, sometimes a little more or less. We have a panel of about ten judges, and we set a submission deadline, usually about a week before the Moot.
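As a rough sketch of how a rubric like this turns category judgments into a total score, here is a minimal example. The category names and point ranges are invented for illustration; they are not the actual MAJ showcase rubric.

```python
# Sketch of rubric-style scoring with a holistic adjustment.
# Category names and point ranges are invented for illustration;
# this is not the actual MAJ showcase rubric.

RUBRIC = {
    "variety of activities": (0, 10),
    "interactivity": (0, 10),
    "multimedia use": (0, 5),
    "feedback to learners": (0, 5),
    "extent of content": (0, 5),
}

def score_course(category_scores, holistic_adjustment=0):
    """Sum per-category points, clamped to each category's range,
    then apply the judges' holistic plus/minus adjustment."""
    total = 0
    for category, (lo, hi) in RUBRIC.items():
        points = category_scores.get(category, 0)
        total += max(lo, min(hi, points))
    return total + holistic_adjustment

example = {
    "variety of activities": 8,
    "interactivity": 7,
    "multimedia use": 4,
    "feedback to learners": 3,
    "extent of content": 5,
}
print(score_course(example, holistic_adjustment=2))  # 8+7+4+3+5+2 = 29
```

The holistic adjustment corresponds to the "plus" row at the bottom of the rubric, where a judge's comments on context of use can raise or lower the analytic total.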
Each judge gets three or four courses. Someone compiles the results; these days it's me. Judges also post feedback on the showcase, the Moodle site I showed before. We have a judges' meeting when the Moot starts, and then an award ceremony with certificates and prizes. The lucky winners, first, second, and third place, or honorable mention, get a certificate and this wonderful "Keep Calm and Moodle On" coffee mug.

Okay. So those are the two models, two different systems, that we've rushed through presenting. Comparing them, their strengths, their differences and similarities, Sila and I will just have a conversation. For example, in Japan we're concerned only with Moodle courseware. We are the Moodle Association of Japan, so of course our goal is to promote Moodle, and we want Moodle courseware. How about you, Sila?

We use Moodle by accident. Right now it's supported by the government. The big universities have their own Moodle, so we develop our own. But schools, like secondary schools, use Moodle because it has nice quizzes and they can do complicated things they couldn't do in Google Classroom, and it's funded by the government. So the majority of the submitted courses are Moodle courses, but yeah, it's by accident. Also, vocational schools and some secondary schools can send their courses, though private schools send only one course per institution. The big universities compete: about half of the courses actually come from our university, and half come from Tartu, so between us we have the bigger competition. But beyond that, this year we had courses from 16 different institutions, and it's not that they're all from Moodle.
But it's still not Moodle-centered; everyone can send. We've had wiki pages sent in, and courses from every LMS. It's only by accident that it's mostly Moodle.

In Estonia, you also have a self-evaluation process. So you begin with student feedback, and then the content creators also evaluate their own courses before they submit?

Yes, the bureaucracy is quite long, and the course author has to use the same matrix for self-evaluation. Your teachers and professors don't have to do anything; they can just upload the course and it's done. But our professors actually have to do a lot, and sometimes they can't send their course simply because of the bureaucracy. Maybe someone in the institution doesn't give them their student feedback, and then they can't send it. I had one course that was not sent last year only because of that.

Yeah, we don't have a self-evaluation process in Japan. We want people to submit, we want people to share, so we try to lower the hurdles: if you've got something, come on, give it to us. You have access to the original e-course site?

Yes, always. Always.

In our case, the courses are actually uploaded, so the whole course is on a Moodle site that we call the showcase. Occasionally someone who submits a course will give a link to the site where they actually use it, but on the showcase we have the ability to see the whole course, to see how it works, and to experience it as well, which shows that it does work. If a course needs additional plugins, we try to keep the showcase updated with any plugin that's required for that course.

Yeah, I like that you can download the course and see all the technical information; that was one of my favorite things I discovered in your rubric.

We also have a versioning feature.
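The plugin bookkeeping just described, tracking which plugins each shared course needs so the showcase site can keep every course runnable, might look something like this minimal sketch. All course and plugin names here are invented for illustration; this is not the showcase's actual data model.

```python
# Minimal sketch of plugin bookkeeping for shared course backups.
# Course and plugin names are invented for illustration; this is
# not the showcase's actual data model.

courses = {
    "Academic English A": {
        "moodle_version": "4.1",
        "required_plugins": {"mod_hvp", "mod_questionnaire"},
    },
    "Intro Physics": {
        "moodle_version": "4.3",
        "required_plugins": {"mod_hvp"},
    },
}

def plugins_to_install(courses, installed):
    """Return the plugins the showcase site still needs so that
    every shared course can run as its author intended."""
    needed = set()
    for info in courses.values():
        needed |= info["required_plugins"]
    return sorted(needed - set(installed))

print(plugins_to_install(courses, {"mod_questionnaire"}))
# ['mod_hvp']
```

Recording the original site's Moodle version alongside the plugin list is what lets a downloader judge in advance whether a restored MBZ backup will work on their own site.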
So when someone submits a course, they do provide a lot of metadata. They tell us which Moodle version is running on the original site, and we have information about the plugins that anyone who wants to download the course would need to install on their site.

We just have access, and we can see the course exactly as the teacher or professor, the course author, does; we get the same roles in Moodle.

One thing you told me was that when you notify people who want their courses assessed, you offer preliminary support, preparation support, before the event.

Yes, we start promoting as early as we can, because we want our professors to get the label and we want to help them out. They can turn to us and ask for help; that's why we're there. Anyway, that's my job, to help them with their courses.

For us, it's always a bit of a task to get people to submit courses. Occasionally someone who's thinking about it will ask for advice and want some help. In Estonia, as well, your judges are anonymous.

Yes. This is something I had never thought about, but your rubric is such that I was able to see courses that are maybe three years old and what people were analyzing and saying. I could not get that information in our system. I do my own work in my own group, and I will not see what feedback the other courses are getting; I actually have to ask my professors to send me their feedback, and even then I don't know who gave it to them. So it's totally anonymous. Your rubric is good because I can see what, let's say, Adam told you three years ago. I think it's fun.

And you have bonus points as well, for something holistic or interesting.

Yeah, even though we have 29 questions, we still have a part for bonus points, because let's say someone has done something so innovative that no one has done it before.
And there's no question where I can put the point; then I can still give bonus points and explain why I did that.

One other difference is that in Japan our judging process is quite short, just about one week from submission to the award ceremony. Yours is quite long.

Yeah, like I showed you, the process takes lots of time, lots of teamwork, and lots of agreement, finding the consensus.

I showed a picture of our certificates and the mug; that's about the extent of our prizes. We want people to share, and it's all very familiar. But you have quite a large cash prize for first place.

That makes them motivated to take part in this. And they are motivated.

Okay, we have five minutes. Anyway, that was a brief introduction to two systems. How would yours be different? How are our priorities different? What would your priorities be? These assessment models are works in progress, like the courses themselves; they continue to develop. That's where we are. Okay. Thank you. Okay, yes, we have some time for questions.

Actually, I just wanted to say that the Estonian system is quite new to me, but I did have the honor of being one of the judges in the Japanese system at a Japanese Moodle Moot in 2016, so that wasn't new to me. We have a question over there, so I'll leave you to it.

Hi, we use Quality Matters as our course judgment system, our rubric. But do the ones you have include any Moodle-specific questions, in Japan or in Estonia? Or are they mostly platform-agnostic?

Sorry, they're mostly what?

Do you have any Moodle-specific questions? Like, did they use this tool, did they use forums, did they use the quiz? Or is it much more about the pedagogical strategies in the course?

Yeah, that's one thing we didn't get to mention. We of course look at the content.
But most of the judges are past winners of the showcase, and because most of the contributions are by language teachers, if we get a physics course or something, I don't know anything about physics. So yes, we're looking for Moodle things; Moodle is the priority. Here, this is a completed rubric, and you can see that for the third category they gave quite a low score, because it was a language course, and that's not terribly highly valued since we get a lot of those. Yeah, I know it's not readable anymore. Time for one more. Yes.

Hi, I think this is probably a question that applies to both. What status does your judgment have back in the contributing institution? And, since I've only got one question, I'll make it a two-parter: what happens if there's a conflict between your judgment and what the contributing organization is requiring?

Are you referring to both our systems?

Yeah, both.

In Japan, I don't think we really have that problem. Teachers submit on an individual basis; they're not particularly representing their university or anything. I don't think we've had any conflicts.

Would it be prestigious for them to go back to their institution and put it forward as part of their professional portfolio, that they had achieved a certain judgment from you? Or is it just...?

Yes, yes, right. It's a valued certificate that teachers do use.

So the institutions value your judgment, but you don't have any conflict.

Yeah, good. The Estonian system?

We have the process I showed you before: three judges judge three courses, and if for some reason they don't reach consensus, then a mentor, one real expert, will help them out.
So it might happen. And would they put that forward within their school as something prestigious? Would your judgments be seen as a positive thing back in their institution?

The institution doesn't see the feedback. Right, only the course author, because they have to log into the online system, and there they will see it; no one else will see it.

But would the course author take it to, you know, their manager and say, look, this is a good course?

We can continue later, because we're out of time. Yeah, sure, I'm happy to discuss that topic.