Welcome everyone to this new event that the EDEN Network of Academics and Professionals is offering to its membership, of course within the EDEN Network. Today we are going to have, as usual, a very interesting webinar on a very relevant topic: OER assessment and evaluation. It is a topic that our host today, Francesca Amenduni, is dealing with within the OpenVM project, coordinated by Beuth University of Applied Sciences in Berlin, a European Erasmus Plus project, but you will learn everything later thanks to Francesca's presentation. I just wanted to give you some information about what the EDEN Network is and what it represents and offers to its membership. It is a community for professional development within the EDEN Association and it supports networking for individual members. It supports meeting and communication forums like the one that we are attending today. It is coordinated by a steering committee, which I chair. It helps with information, it helps in building up a personal portfolio, it promotes communication, and it helps in finding partners within the membership in order to promote new research and new international projects. So please try to participate in our events and to support our community. This is the steering committee, so that you can put faces to the names: Alfredo Soeiro, El Gatorner, Wendy Chowy, Don Olcott, Elsebeth Korsgaard Sorensen, Alastair Creelman, all people that you surely know, but whom you can now contact knowing also their role within the network. This slide introduces our NAP members' area within the EDEN website. Here you have a list of the services that you can access as NAP members. I already mentioned several of them, but here you have all the details. I do not want to steal time from our webinar. As you can see from our programme, especially this term's programme, we try to support the participation I mentioned before through different media and different channels. The possibility to attend webinars, and the possibility to be part of the EDEN Chat as well (and I remind you that we will have an EDEN Chat, a follow-up activity related to this webinar, later on at 6 p.m., so don't miss it), are all ways of being part of this community. The last slide is to tell you about our next face-to-face event, our annual conference, which will take place in Bruges from 16 to 19 June. It will be a great event as usual, and you will have the opportunity to attend one of our workshops, especially organized by the NAP, the Network of Academics and Professionals, where a sort of speed-dating event will be organized, and different research groups and different research ideas can mix and start new projects, new groups, new interactive opportunities. So I will wait for all of you there in Bruges in June. And now the floor is yours, Francesca. Francesca Amenduni will tell you about her background herself, but she is a psychologist, and she is now attending a PhD course in experimental pedagogy at Roma Tre University and the University of Bruges. So please, Francesca, the floor is yours. Thank you for your work.

Thank you. Thank you so much for the introduction. Today I will present a webinar entitled Assessing OERs' Quality: the Open Virtual Mobility MiniMOOC case. As was said before, I am a PhD student at Roma Tre University, and I am involved in this Erasmus Plus project, coordinated by Beuth University; the Roma Tre coordinator is Antonella Poce.
This is the agenda of this presentation. First of all, I will try to explain why it is important to think about open educational resources quality assessment. Then we will see frameworks for OER assessment and how we used them in the Open Virtual Mobility Erasmus Plus project. Finally, we will see together the preliminary results of the pre-pilot phase, and how these results are integrated in a design-based research perspective. Teaching and learning mediated by the use of technologies require a deep reflection on the content features that support learning processes. In a learning environment such as a MOOC or a blended course, contents could be created exclusively by teachers and educators, or, alternatively, existing contents could be used, reused and edited. Both strategies present advantages and disadvantages, and I have tried to summarize some of them. Educational content creation, for example, offers as an advantage a high level of personalization: the teacher can decide which language to use, which kind of media to use, a video, a PDF or a book, and how to organize the content. And, as noted by the OECD in 2007, you can use educational resources, for example, to promote yourself or your institution. However, there are some disadvantages. For example, creating new educational resources can be time consuming and requires a lot of design work. In addition, good design skills and digital skills are needed to design new educational resources. Moving on to the advantages of educational content reuse, the first is, of course, reduced costs in terms of design and time. In addition, you do not need a high level of design and digital skills. One of the problems is that you get a low or medium level of personalization, and, especially for people who do not speak English, there are difficulties in finding good OERs in other languages. Why are OERs valuable? Open educational resources are important because they include not only the advantages of educational content creation, but they also dramatically empower the advantages of educational content reuse. According to Tuomi, openness is about the right and the ability to modify the package and add value to the resources. This kind of openness blurs the traditional distinction between content creation and reuse, and between formal, non-formal and informal learning. Here I provide one of the possible definitions of open educational resources, the definition provided by the OECD in 2007: open educational resources are digitized materials offered freely and openly for educators, students and self-learners to use and reuse for teaching, learning and research. Looking at this definition, we see that the concept of open educational resources is both broad and vague. A wide variety of objects and online materials can be classified as open educational resources, and in this slide you can see all the kinds of digitized materials that can be classified as such. We can find three kinds of digitized materials: learning content, tools and implementation resources. OERs are not only single course components, but also include whole courses, or for example museum collections, open access journals and reference works. Over time, the term has come to cover not only learning content, but also learning and content management software and content development tools.
Finally, OERs include implementation resources such as standards and licensing tools for publishing digital resources, which allow users to adapt resources in accordance with cultural, curricular and pedagogical requirements. In this slide, I have tried to analyze the three elements of the term open educational resources. The first one is open. Openness can be defined as including two dimensions, a social one and a technical one. The social domain concerns the freedom to use, contribute and share resources. Concerns in the social domain are, for example, copyright, the price of access, and accessibility. Regarding the copyright challenge, the Creative Commons licenses are the best known and most often used open licenses at present and offer a number of sharing options. Accessibility can depend on individual capabilities: for example, course content may be freely available but in a language that the user does not understand. The technical domain, in turn, is characterized by technical interoperability and functionality. Open standards are important since they make it possible for different software applications to work together. Moving to the second term, educational: not only material produced for use in formal educational settings should be included, but also material produced outside schools or universities, such as newspaper articles and material produced for non-formal or informal learning contexts. According to Downes, writing in 2006, there is no necessary a priori stipulation that something may or may not be an educational resource. Since learning extends beyond formal settings, a resource used in a non-formal setting may still be defined as an OER. Finally, the last term is resource. We saw before some kinds of digitized resources, and a resource can be defined as anything that can be used to organize and support learning experiences. As we saw in the previous slides, we mainly mean digitized materials. Now I am going on, but I hope at the end we can speak about your questions about this presentation. Okay, now we come to the core of our presentation. Why is it important to think about open educational resources assessment? A systematic approach to OER quality assessment is particularly important to make decisions about which existing resources you should or should not include in your learning path. The rapidly growing number of learning materials and repositories raises the issue of how to find the most relevant and best quality resources. In repositories such as MERLOT, Connexions, OpenLearn and others, there are hundreds of thousands of pieces of content or materials, representing thousands of freely available learning hours. Although the dominant language so far is English, the translation of resources, combined with a growing number of non-English OER projects, brings greater language diversity and increases global use; the potential number of users is enormous. Here I provide a definition of quality. Quality can be defined as appropriately meeting the stakeholders' objectives and needs, which is the result of a transparent, participatory negotiation process within an organization. In the context of OER, quality can, for example, mean that a teacher finds a suitable resource for his or her teaching. Quality can be broken down into five elements: the first one is efficacy, the second one is impact, the third one is availability, the fourth one is accuracy, and the fifth one is excellence.
Efficacy can be defined as the fitness for purpose of the object. For example, in the context of OER, this might include concepts such as ease of reuse or educational value. Impact is a measure of the extent to which an object or concept proves to be effective. Impact depends on the nature of the object, the context in which it is applied and the way the user puts it to use; for example, how much the OER impacts on the learning process. Availability refers, for example, to qualities such as transparency and ease of access to a resource. Accuracy is a measure of precision and absence of errors in a particular process or object. Excellence compares the quality of an object or concept to similar objects or to its potential quality. There are different levels in a quality process, going from the most generic to the most specific level. The most generic level is the quality of the organization. At the second level of this pyramid, you can see the quality of courses. There are many quality approaches for courses, for example programme certification. With regard to content, key features could be assessed automatically, such as metadata quality, language and grammar, tag quality, as well as essential elements, learning activities, media usage and technical correctness. However, when content changes rapidly and dynamically, it is essential to plan incremental quality checks. The third level of the pyramid is the quality of metadata, and another level is individual OER quality: do OERs fit the learning context? As quality is not a generic concept, user behavior and comments can indicate the quality of an OER in relation to the learning context. Transferability and adaptability is the last level: how can OERs be contextualized? This is a key quality attribute regarding adaptation of language, culture, design, didactics and so on. In this slide you can see different kinds of approaches to quality. The first are generic quality approaches, the second are specific quality approaches, and the third are specific quality instruments. It is useful to think about all three kinds of quality approaches; however, in this presentation we will focus on the development of specific quality instruments, and in particular on peer review tools. In the OpenVM Erasmus Plus project, we developed a grid to conduct peer reviews of OERs and rating systems that we used to assess the quality of our mini-MOOCs. According to the OECD, there are also other kinds of approaches: as shown in the figure, the process can be centrally designed or decentralized, and it may be open or closed. As I said before, in the OpenVM project we adopted mainly a peer review approach for the OER quality assessment, and we also used user comments for the MOOC quality assessment. An innovative tool for the evaluation of OERs is social ranking, which can be described as a form of crowdsourced peer review. In the OpenVM approach, we adopted a method that includes elements of both the traditional review and social rating. For example, in our case, actors can change roles between reviewer and producer depending on the context, and this makes our process more similar to social rating. On the other hand, assessors were provided with specific guidelines and instructions, as in traditional review forms. So we tried to combine the advantages of the traditional review and of social rating in our framework.
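To make this combination more concrete, here is a minimal, purely illustrative sketch of how a structured peer-review score and crowdsourced ratings could be blended into a single quality estimate; the indicator names, scales and weights are hypothetical assumptions and do not reproduce the project's actual calculation.

```python
# Hypothetical sketch: blending a rubric-based peer review with crowd ratings.
from statistics import mean

def combined_quality_score(peer_review_scores, crowd_ratings,
                           peer_weight=0.7, crowd_weight=0.3):
    """Blend one reviewer's rubric scores (1-5 per macro-indicator)
    with crowdsourced 1-5 ratings into a single quality estimate."""
    peer_score = mean(peer_review_scores.values())
    if not crowd_ratings:               # no community ratings yet
        return peer_score
    crowd_score = mean(crowd_ratings)
    return peer_weight * peer_score + crowd_weight * crowd_score

# Example: one structured review plus three user ratings.
review = {"quality": 4, "appropriateness": 5, "technical aspects": 3}
print(round(combined_quality_score(review, [5, 4, 4]), 2))  # -> 4.1
```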
Now I can finally speak about the Open Virtual Mobility (OpenVM) Erasmus Plus project, and in particular the case study of our MOOC. The project is a European strategic partnership dedicated to creating accessible opportunities for the achievement of virtual mobility skills and to ensuring a higher uptake of virtual mobility in higher education in Europe. It is funded under the European Erasmus Plus programme. The project is expected to achieve seven intellectual outputs related to different aspects of the Open Virtual Mobility ideation and implementation. In this slide, I show all the partners involved in the project. The OpenVM project is coordinated by Beuth University of Applied Sciences in Berlin, and other European partners are involved: Roma Tre University, the University of Timișoara, the Open University of the Netherlands, KU Leuven, AUNEGE, UNIT and WIP. The Roma Tre University group, coordinated by Antonella Poce, is responsible for output six, named OERs, MOOC and Pilot. This output is aimed at designing the virtual mobility OERs and the virtual mobility MOOC, and at ensuring the project's sustainability through a piloting phase. Here I present the six phases that we have been following, and in bold you can see all the phases that directly concern our quality approach. The first phase was developing the specific OER quality assessment tools. In the second phase, we asked all the partners to search for, create and select OERs. In the third phase, we conducted a social ranking, a crowdsourced peer review that I will show you later. Then, in the fourth phase, we integrated the OERs into the Open Virtual Mobility MOOC. In the fifth phase, which is currently in progress, we have the distribution and attendance of the MOOC. Then we collected user comments and ratings, and finally we used the results to redesign the MOOCs and OERs, following a design-based research model. The first phase, as I said, was developing specific OER quality assessment tools. So, at Roma Tre University, we created a grid composed of three main macro-indicators: quality, appropriateness and technical aspects. Our evaluation rubric was mainly inspired by another rubric, the Achieve.org OER rubric, and you can see here all the references that we used to create our evaluation grid. And you can see here our grid. Quality, appropriateness and technical aspects were the macro-indicators, and for each macro-indicator we had some specific sub-indicators. For example, in the case of quality, we needed to know who the creator of the resource is and, if an expert, what their expertise is; creator authenticity; creator bias, for example what the intended purpose of the creator is; organization affiliation; organization quality control; whether the resource was previously reviewed; material currency, that is, how recent or up-to-date its content is (we decided to use only recent content); and the type of assessment that is included in the resource. In some cases, we also adopted OERs that do not include assessment, because in our project we developed new e-assessment quizzes connected to our OERs. Appropriateness was divided into these sub-indicators: clarity of structure and content; alignment with the eight topics, since, as we will see later, our MOOC is divided into eight mini-MOOCs, which are intercultural skills, collaborative learning, autonomy-driven learning, networked learning, media and digital literacy, active self-regulated learning, open-mindedness and virtual mobility knowledge; and, finally, we needed to define the difficulty level.
So, we defined three different difficulty levels: beginner, intermediate and advanced. Last, moving on to technical aspects, it was very important to check the licensing status; we used only Creative Commons licenses in our project. Then human accessibility, that is, whether the resource is accessible to people with disabilities; the possibility to remix or edit the resource; technical accessibility; technical quality; and the number of questions presented in the e-assessment. The second phase was supporting the partners in searching for, creating and selecting OERs. The Roma Tre team organized the work as follows: each partner had to find at least nine OERs related to one of the skills defined in output one. The skill assignment was based on the partners' scientific expertise. We identified different kinds of repositories that could be used; as you can see, the repositories are in different languages, in order to support the internationalization of the project. However, we did not use only these formal repositories: we also used, for example, informal learning platforms such as TEDx to look for videos and other kinds of resources. And this is our phase of social ranking, the crowdsourced peer review. Partners had to list the OERs that they found in a Google Sheet. The OERs were then selected and peer-assessed by another partner of the project. Peer assessors could add comments and feedback, as you can see, or, for example, propose alternative OERs. In this way, partners had the opportunity to compare their opinions about the OERs that could be included in the Virtual Mobility MOOC. So you can see, for example, that in the first column there are all the links to the OERs, and in the other columns there are the sub-indicators of our OER evaluation grid that I showed before. You can also see the comments, so partners could discuss and comment on the quality of the OERs in order to make OER assessment a social process. Phase 4 was integrating the OERs into the Open Virtual Mobility MOOC. As I said before, the OpenVM MOOC is organized into eight mini-MOOCs. Each mini-MOOC has the following structure. The first step participants have to take is to fill in a pre-assessment. According to the score they obtain, they are directed to the foundation level, the intermediate level or the advanced level. Each combination of a level and a mini-MOOC is defined as a sub-MOOC; in total, we have 24 sub-MOOCs, because we multiply 8 mini-MOOCs by 3 levels. Each sub-MOOC has different forms of assessment: in the foundation level and in the intermediate level there are mainly summative and formative assessments such as quizzes, while in the advanced level there are also e-portfolio and peer assessment. After participants complete the course, they obtain a badge that certifies the skills acquired in the specific sub-MOOC. This is the minimal structure of each sub-MOOC: each sub-MOOC should contain at least one video, at least one text, which could be a PDF or a PowerPoint, discussion forums, and a quiz or other e-assessment, such as peer assessment and e-portfolio. Phase 5 was distributing the MOOC and having participants attend it. In January 2019, 60 students from France and Germany joined the mini-MOOC Media and Digital Literacy. You can see here a preview of the course.
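As a purely illustrative sketch of the two structures just described, the OER evaluation grid and the routing of participants into sub-MOOCs, here is a minimal example; the indicator labels follow the grid presented in the webinar, while the data structures, score range and cut-off values are hypothetical assumptions, not the project's actual implementation.

```python
# Hypothetical sketch of the OER evaluation grid and sub-MOOC routing.

OER_EVALUATION_GRID = {
    "quality": [
        "creator expertise", "creator authenticity", "creator bias",
        "organization affiliation", "organization quality control",
        "previously reviewed", "material currency", "type of assessment",
    ],
    "appropriateness": [
        "clarity of structure and content",
        "alignment with the eight OpenVM topics",
        "difficulty level",  # beginner / intermediate / advanced
    ],
    "technical aspects": [
        "licensing status (Creative Commons)", "human accessibility",
        "possibility to remix or edit", "technical accessibility",
        "technical quality", "number of e-assessment questions",
    ],
}

def missing_indicators(assessment: dict) -> list:
    """List the sub-indicators a peer assessor has not filled in yet."""
    return [sub for subs in OER_EVALUATION_GRID.values()
            for sub in subs if sub not in assessment]

# Routing after the pre-assessment: 8 mini-MOOCs x 3 levels = 24 sub-MOOCs.
def route_to_sub_mooc(mini_mooc: str, pre_assessment_score: float) -> str:
    """Map a pre-assessment score (assumed 0-100) to a sub-MOOC label,
    using hypothetical cut-off values."""
    if pre_assessment_score < 40:
        level = "foundation"
    elif pre_assessment_score < 75:
        level = "intermediate"
    else:
        level = "advanced"
    return f"{mini_mooc} / {level}"

print(route_to_sub_mooc("Media and Digital Literacy", 82))
# -> Media and Digital Literacy / advanced
```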
At the end of the pre-pilot phase, students were invited to fill in an online questionnaire aimed at investigating participants' general evaluation, participants' specific evaluation, participants' recommendations for improvement, and the hours spent to complete the course. 194 sentences and 259 segments were analysed through sentiment analysis using the software MeaningCloud. Here you can see the results of the sentiment analysis: in most comments, 47%, participants expressed positive sentiments; in 17% of cases, participants showed negative sentiments; in 4% they were neutral; and in 32% they did not show any kind of sentiment. Positive comments are related to the nouns learning, topics, content, design, visuals, videos and community, and to the adjectives simple and clear. On the other hand, negative comments are related to the nouns structure, e-test, e-portfolio, problems and tasks, and to the adjectives confusing, much and unattractive. So we performed a qualitative analysis to see how these words are connected within the comments of the participants. Participants enjoyed the selected content, especially in the form of videos. The basic level content related to the Creative Commons licenses was the most appreciated. Here is an extract: "The course was a useful introduction to media literacy. It contained useful information about important topics in the Internet like verifying sources in the Internet. That might be helpful for students. The videos on TEDx are fascinating. The instructions were easy to follow. All in all, I liked the course because it is a new way of learning a new topic." The participants appreciated the opportunity to test their skills through e-assessment and e-portfolio. They also suggested improving the e-portfolio functionality, as reported in the following extract: the participants would need clear instructions regarding how to fill in their portfolios, providing, for instance, a template. And I report another extract: "The e-assessment was useful to reflect on the stuff I have learned, the small tasks where we could. At first, creating my e-portfolio was difficult for me because I didn't have any example of how to start or write something. I also didn't want to post my tweet publicly. Students may think my skills are not enough. The tests are on a good level." We saw that participants had contrasting ideas about publicly sharing their reflections in the discussion forums or e-portfolio: while some appreciated the opportunity to share their ideas with the community, others did not, and I report different extracts that show opposite reflections on this topic. Some participants found that the text contents in the advanced level were too long and complicated, and the instructions related to the exercises were not always clear. They did not appreciate being redirected to external links, both for the e-portfolio and for the content, because they lost track. The participants spent about four hours on average to complete the course: sixty minutes for the basic level, ninety minutes for the intermediate level and ninety minutes for the advanced level. Some of them completed the course within a week. The virtual mobility implementation needs to follow an iterative process of ideation, design, assessment and redesign to be effective. In the OpenVM Erasmus Plus project, a design-based research approach was followed to constantly monitor and improve the quality of the Open Virtual Mobility MOOC.
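As a small, purely illustrative sketch of how such a percentage breakdown of labelled segments can be computed (the labels and toy counts below are only for demonstration; this is not the sentiment analysis pipeline used in the study):

```python
# Hypothetical sketch: tallying sentiment labels into percentages.
from collections import Counter

def sentiment_distribution(labelled_segments):
    """Return the percentage of segments carrying each sentiment label."""
    counts = Counter(labelled_segments)
    total = len(labelled_segments)
    return {label: round(100 * n / total, 1) for label, n in counts.items()}

# Toy data mirroring the reported shares of the four categories.
segments = (["positive"] * 47 + ["negative"] * 17
            + ["neutral"] * 4 + ["no sentiment"] * 32)
print(sentiment_distribution(segments))
# -> {'positive': 47.0, 'negative': 17.0, 'neutral': 4.0, 'no sentiment': 32.0}
```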
According to Easterday and colleagues, the six phases of design-based research are focus, understand, define, conceive, build and test. Output one of the project was devoted to the first three phases, since the skills necessary to actively and effectively participate in virtual mobility were identified and redefined according to the specific goals of the project. The last three phases are instead part of output six, concerning the Open Virtual Mobility MOOC construction and the pilot phase. The pre-pilot consisted of conceiving, building and testing a mini-MOOC named Media and Digital Literacy. The pre-pilot phase is conceived as the first cycle of the design-based research approach; design-based research is indeed an iterative and circular process. Although the general assessment of the course was positive, there is still room for improvement. The results were used to improve the design of the Media and Digital Literacy MOOC and of the other MOOCs: we provided prompts and templates for the e-portfolio, we tried to include more videos and to balance the use of scientific references with more interactive content, and we tried to embed all the content in the MOOC. New mini-MOOCs are going to be launched soon, so stay tuned. The first MOOC, Media and Digital Literacy, is already available, and you can go to this link to see the course, to join it, to share it with your students or to try it. In the next weeks, the following MOOCs will be launched. Follow us, try our MOOCs and please provide feedback to improve them, because we are in the pilot phase. Since the availability of OERs is growing dramatically, it is urgent to develop tools to assess OER quality in a more efficient way. Quality is a complex concept that depends on the perspective of different stakeholders, thus it is necessary to integrate different methods to assess quality: open and closed methods, centralized and decentralized methods, general and particular aspects. In the Open Virtual Mobility project, we integrated a crowdsourced peer review approach for the OER quality assessment with a design-based research approach for the MOOC quality assessment. The results of the analysis are used to improve the quality of the OERs and of the Open Virtual Mobility MOOC. Now I am concluding, and I will tell you that we will see you later on Twitter: follow the EDEN Chat at 6 p.m., where we will speak about the evaluation and creation of open educational resources. And here I provide some references. I saw before some questions about references, so we can start from here. Are you ready to answer the first question?

Thank you so much, Francesca, thank you for your detailed report, which we really appreciated. Actually, there is also another question that I will ask you, sent by Gérard Casanova, who is also part of the OpenVM group. He is asking: Francesca, can you tell us more regarding the scientific validation of OERs? He mentions an example: if we use an OER which is not scientifically proven, for instance one on global warming, and this OER says that global warming does not exist, how can we validate OERs from a scientific point of view?

Yes, in our grid, and I show you our grid because it is very, very important, we consider the creator: if the creator is an institution, the authenticity of the creator. So we need to do some research about who the creator is.
So in our assessment of the OERs, our partners needed to carefully consider who the creator of the open educational resources is. And specifically in the Open Virtual Mobility project, we have experts on the topics that are included. For example, one of the topics is intercultural skills, and we included universities that are experts in intercultural skills, and the same goes for collaborative skills. So we tried to guarantee scientific quality by involving experts.

Great, thank you. Yes, yes, in fact, we are able to assess it, so this was an aspect that was taken care of. I see that there was another question related to the availability of this presentation; of course it is recorded and it will be uploaded to the EDEN website, as with every event. We are very grateful to Francesca, especially because she will also host the EDEN Chat later on. And it will be interesting, I think, to have this follow-up on the EDEN Chat, because if you have other questions, other curiosities or different views to propose, you can do it on the EDEN Chat. You also have a little bit of time to reflect on what Francesca told us during this webinar, and if you have links or other pages that you want to mention and share on the EDEN Chat, that would be a great chance to do it. You know, I was mentioning in the beginning that we need to be a real community where interaction is the key word, and that would really be a good opportunity. Thank you really, Francesca, thank you so much, it was a great presentation, really. So see you later on the EDEN Chat, and I say goodbye to everyone. Oh, the EDEN secretariat is here as well, of course, and Christina was helping us; thank you so much, Christina, for your help. She is mentioning, and I was forgetting about it, that we will be back very soon with another webinar on April 17. The host will be Chiara Zuanni from the University of Graz, Austria, and the topic will be very, very interesting again: public archaeology and online engagement, a topic which is very much related to the conference I am attending at the moment, because I am talking from Madrid at the Moose Access conference. So thank you all, thank you for being here, and see you very soon. Bye bye. Bye.