Good morning and welcome back to this interactive session of FDP201X, Pedagogy for Online and Blended Teaching-Learning Process. I am Sahana Murthy from IIT Bombay. In this session we will mostly address your concerns, primarily about peer assessment, because we have been receiving a number of concerns and we have thought about them. We have some responses, and later we will see if you have any concerns about implementing active learning and talk about that too. The way we will structure this session, from 11.30 for about an hour or an hour and fifteen minutes, is this: our team has consolidated many of your concerns, so I will give a brief overview of what we have seen so far and try to address the common concerns, because those will be applicable to many of you, and then we will open the floor for an open question and answer session. So what have you done so far? Let us take a brief overview of what has happened since FDP101X. If you recall, in week 3 of FDP101X there was an explanation of what peer review is, what its benefits are, why to do peer review, how to do it well, and so on. In case you did not participate, or if you do not recall, you can go back to the previous Faculty Development Program, watch the LeDs, the Learning Dialogues 2.6 and 2.7, and do the reflection spots associated with them. The other thing I would like to recall is the example review that was provided at that time: an essay was given, each participant in the program had to provide a review based on the given rubrics, and then there was the instructor's review and a matching exercise. This is just to remind you of what we have done so far. The other thing you have all been doing since then is peer assessment and peer review with each resource creation assignment.
For each RCA there are four phases, if you recall, and we have provided rubrics, scores, feedback and so on. Now let us briefly recall why peer assessment, and what its benefits are. Since we have so far discussed this from our side, let us do it as a joint activity; what we have created here is a think-pair-share activity. You are by now very familiar with such activities, so I will not explain how to do the think-pair-share. There are three phases. In the first phase, individually, please write down in your notebook, or with whatever writing implements you have, one benefit that you perceive from peer assessment in this faculty development program. This is very personal, and if you feel you are not sure of the benefit, reflect a little and see what benefit you have experienced, either as an assessor who reviews somebody else's assignment, or as an assessee who gets feedback, and write down one such benefit. One minute for this, please. Please do participate in this exercise, because the rest of the session is about concerns, but first we have to look at benefits. Peer assessment is known to be very beneficial, both in many studies across many universities, and in our own experience in such FDPs. So it is not going to go away. What we need to do is learn how to do it well, and we will fix the problems; we will work with you to fix the problems. So it is really important that you seriously think of the benefits and proceed from there. In the pair phase, discuss with your neighbour and come up with one benefit for the person doing the assessment, who gives the review, and one benefit for the person getting the review. We are calling them the assessor and the assessee. You may have written your benefit from either role, but together as a pair, please write down one benefit for each. Two minutes for this part of the activity.
Please recall, this is the pair phase, so you have to be discussing with your neighbour. It is okay if three of you discuss together, but together you need to figure out one benefit for the assessor and one for the assessee. Workshop coordinators, please encourage the participants in your RC to discuss with each other; this is not the individual phase anymore. Okay, we will progress to the share phase, and here the workshop coordinator has a little work to do. I think the format of the think-pair-share over A-VIEW has by now become very familiar. What the coordinator can do is initiate a discussion with the participants in your RC, get their responses, and share the most common benefit from your RC. It can be either for the person giving the feedback or for the person getting the feedback. Please do a poll, or initiate a discussion among your participants, see what the various benefits are, and then let us know on chat one benefit that is most common among your participants. Okay, we are getting a lot of responses here. Please keep typing; we will wait till we get a few more responses, and when you share, please let us know whether the benefit is for the assessor, for the assessee, or for both. Okay, we are getting some good responses; we have gotten about 30, and we have almost 150 RCs today, so please keep them coming. We will wait for another minute and then discuss. These benefits can be for either the assessor, the assessee, or both, so please note which one you are considering. Okay, I am going to discuss what some of you have written, and the chat is closed now. Many of you are saying that both the assessor and the assessee will benefit, but let me look at some specific ideas you have written.
Constructive or genuine assessment will help because it will precisely point out errors and room for improvement, and the desired improvement will happen. That is one of the benefits, yes, that is correct. The assessor can understand different participants' work and compare it with their own. This is very important learning, because even the person giving the feedback can learn from the responses, answers and solutions of other peers. It increases involvement of the participants, or learners, in the course. Very much so; there is a lot of engagement in this activity, and there is community building, because peer review is in fact one of the foundations of building a community: all scientists and engineers, the whole scientific community, work by this process. This one is from another of you: the assessor gets to know the standards on which he or she has to assess. Very good, because when I am trying to assess somebody's work, I have to figure out what the criteria and the standards are, so my understanding of the standards becomes clearer. And the assessee can get additional ideas on how he or she can improve, based on the assessor's thinking; that is correct. There is collaboration, definitely, as we talked about; we will get more ideas. This has come up commonly. One of you has written higher-order thinking; we will see how this happens. You are absolutely right that the peer review process involves a lot of higher-order thinking. So let us go back to the presentation, and let me summarize the benefits mentioned by people who have studied and implemented peer assessment in various settings. Overall, what has been reported is that peer assessment, as well as self-assessment, even though we are not focusing on that too much today, improves learning. This result has come out time and again.
So how does it improve learning, and what types of learning does it improve? Well, it improves oral presentation skills. It improves writing skills more than the condition in which only the instructor provides assessment. So peer assessment is better than instructor-only assessment for oral and writing skills; this result has been shown in many empirical, experimental studies with data. Peer assessment has also been found to improve understanding of the domain and of the process skills involved in different technical domains. So it is not just for collaboration or for getting ideas; actual learning has been shown to improve. There is a lot of advantage for teamwork and enhanced teamwork skills, and as you know, when our students go out into the world as professionals it is critical that they work in teams. Knowing how to do peer assessment accurately and reliably is a very good skill for them, and it helps groups form a common vision for projects. So there are many benefits for teams. And let us look at one last and important point. You may not believe it right now, but there is a lot of data, which we can share with you, showing that a peer assessor, like a student with less skill but more time, can produce assessment of equal reliability and validity to that of a teacher. What this also says is that there is a benefit of peer assessment in terms of the teacher's efficiency, because a teacher may have to evaluate 50 or 100 students; in our case, for this FDP we have 5000 learners. There is no way we can do any individual assessment of 5000 learners, and automated assessment only goes so far, but we do want to give feedback. So peer assessment, with some training and with clear criteria, has been shown to be as effective and as reliable as the teacher, who has less time. With this, let us look at the final takeaway: why we are doing peer assessment in this FDP.
It is an important professional skill, and what we have learnt is that we should not avoid it. In fact we would be doing you a disservice, and you would be doing your students a disservice, if we avoided it. There is no need to avoid it; instead, what we have to do is solve the problems in the process and address the common concerns. And yes, there are concerns, which is why we are doing this session. So, as the bottom line: we will do it, you will do it, you will do it with your students, and we will see how to address the concerns. Now let us come to something I think all of you have been waiting for: your concerns. Here we will actually run a poll. On the next slide I will show you what the poll is, and then we can go to the poll and I will start polling. There are four choices for what, according to you, is the most serious concern in peer assessment in this FDP. Please go through the choices when I put up the poll, and then vote for the most serious concern in your RC. The four choices are: one, I got a harsh review, meaning I got too low a score from the assessor; two, I got a very lenient, too high a score from the assessor; three, as an assessee, a person getting feedback, I do not get detailed feedback; and four, when I assess, I do not understand how to review. So the first three concern the review you receive as an assessee, and the fourth is from the assessor's perspective. Which do you think is the most serious concern for you? We will keep looking at the results for a bit and then discuss them. We have about 75 RCs which have participated; we will wait another 30 seconds or so. Another 10 seconds; we have just over 100 RCs, so I will wait, say, another 15 seconds and then close the poll. Let me tell you what we saw in the results. These were the four choices, and we got responses for each of the four options.
The most serious concern was number three, "I do not get detailed feedback about individual criteria"; about half of you said this. About a third of you, maybe 50 people or so, or even fewer, around 40 RCs, said "I got a too harsh review", and then about 30 each polled number two and number four as the most serious concern. So some of you are saying "I got too lenient a score", and some of you are saying "I do not understand how to review". What I see is that there is, if not an equal, at least a sufficient distribution and representation of each of these concerns, and these really are the most common concerns. So what we will do in the next 10 minutes or so is go over each of these concerns and see how to address them. Before that, let us go back and see what has been reported worldwide when people implement peer assessment; just as we saw the reported benefits, let us see the reported concerns. These are concerns from existing peer assessment instances, not this particular FDP, but places where it has been implemented in other universities and institutions, and the context is several different undergraduate courses, including a large number of engineering courses. So the places where peer assessment has been implemented are similar to your courses. Here are some of the concerns; this list is not exhaustive, but these are the most common. Students are often reported to rate higher than the teacher; very rarely is it reported that students rate lower than the teacher. So students tend to be more lenient, and sometimes this becomes a problem through what is called cronyism: I give my friend a higher score and my friend gives me a higher score, so there is an understanding between us.
Cronyism has been reported occasionally; the first issue, students rating higher than the teacher, is reported more often, and there is concern among instructors who implement peer assessment that it is unreliable because students do not understand the process very well. These are some of the common concerns, and I think some of you have them too. To summarize, let us look at the takeaway from peer assessment studies. It is an important professional skill, and we should not avoid it. It can be reliable, and it is reliable when two conditions hold. One, operational definitions are given; if you recall, in this FDP we have rubrics with detailed criteria, and that is what we mean by operational definitions of the criteria. Two, it has been shown that peer assessment can be sufficiently reliable to include as part of the grades; there is no problem including peer assessment in the grades after the training phase. So some training is required, and let us look at two other points. If training is required, what we need to do as instructors is develop learners' assessment skills, by addressing common concerns, which is what we are doing today and have done a few times earlier, and by providing sufficient training. The last point on the slide is the most important: peer assessment improves with practice. So if you found that the scores were unreliable in the first or second RCA, by the time people get to the fourth RCA the scores will become reliable, provided each of you seriously applies the criteria after understanding them. What is good, or well-done, peer assessment? The focus is not so much on the grades; let me repeat, the focus is not so much on the grades, it is on the giving and taking of feedback. So as an assessor, what we request is that you consider it an exercise in giving feedback, not in giving marks or grades.
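To make "reliable" concrete: one way to check whether peer scores are converging is to compare a peer's rubric scores against a reference review of the same artifact, such as the instructor's review in the FDP101X example-review exercise, and look at per-criterion agreement. A minimal sketch, with invented criterion names and scores (nothing here is taken from the actual RCA rubrics):

```python
# Sketch: per-criterion agreement between a peer's review and a
# reference (instructor) review. Levels are encoded as
# 0 = absent, 1 = needs improvement, 2 = target performance.
instructor = {"alignment": 2, "clarity": 1, "length": 2}
peer       = {"alignment": 2, "clarity": 2, "length": 2}

def agreement(ref, review):
    """Fraction of criteria where the two reviews assign the same level."""
    matches = sum(1 for c in ref if review[c] == ref[c])
    return matches / len(ref)

def mean_abs_gap(ref, review):
    """Average distance between assigned levels; 0 means perfect agreement."""
    return sum(abs(review[c] - ref[c]) for c in ref) / len(ref)

print(agreement(instructor, peer))     # 2 of 3 criteria match -> 0.666...
print(mean_abs_gap(instructor, peer))  # 0.333...
```

Iterating the example review until numbers like these stabilize is exactly the training loop described above: agreement rising across RCAs is what "improves with practice" would look like in the data.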
Your mindset has to be: here is some solution I got; how can I help the other person improve it? Peer assessment engages the learners doing the assessing in higher-order thinking, because as an assessor you have to reflect, make judgments and understand the criteria; there is a lot of higher-order thinking involved. So it is a very accessible way of developing these higher-order thinking skills, by doing this assessment exercise yourself or by asking your students to do it. It promotes learning of both assessor and assessee, as we have discussed, and it should be frequent. Why? Look at the last slide: peer assessment improves with practice, so we need more practice, and to do it well it should be frequent. Okay. So how, specifically, to do peer assessment well? Now I am going to talk from the perspective of this FDP. We need to clearly explain the expectations, what to do in peer assessment and why it is important, and we need to provide training to learners on how to do peer review, because people who have not participated in this process often do not know how. Even as instructors, you may be used to grading, but you may not be used to peer assessment from the perspective of giving feedback and doing a review. On the other hand, if you have participated in a peer review process, say when you evaluate papers for a conference or a journal, you have participated in a process where feedback and review matter more than grading. Next, provide clear instructions on what the assessor should do at each stage; if you remember, each of the four stages we had has clear instructions. Explain in detail the criteria to be used, which are the rubrics, and ask the assessor to give detailed comments and feedback. That is how to do peer assessment well.
Let me walk you through what you have already done and seen. Explain clearly the expectations and why it is important: you did peer review in week 3, LeD 2.6, the learning dialogue where we talked about why to do it, who benefits, what the expectations are, and so on. We provided training to learners on how to do it: if you remember, in the previous FDP last month there was an example review where an essay was given, you had to provide a review and match your review with the instructor's review, and you had to iterate till the scores were reliable. Clear instructions on what the assessor should do: again, if you recall, this is in Learning Dialogue 2.7. The reason I am showing you all this is that if you do not remember where these are, just go back and work through these learning dialogues, including the reflection spots. Explain in detail the criteria used in the review: recall the rubrics. These are not just scores; in fact there are no numbers here. The criteria are in the left-hand column, and each criterion is marked as either absent, needs improvement, or target performance, and each cell is descriptive. So the request, the recommendation, and I would even say the mandate, to each of you is: please read each rubric cell carefully, both before doing a peer assessment and during it. Unless you understand what these criteria really mean, the process is not going to be effective. So take a little time and go through the criteria and their performance indicators. For the assessor to give comments and feedback, we have provided a text box. When you are assessing, you must give detailed reasons in the feedback box; in fact, the absence of such reasons is one of the common concerns that came up even in the poll. This is a community effort, and the more you participate and contribute, the more you will get out of it.
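The per-criterion feedback discipline above can be thought of as a simple completeness check: a review counts as done only when every rubric row has both a descriptive level and a non-empty written reason. A small illustrative sketch; the rubric rows, levels and reviews here are invented for the example, not copied from the actual RCA rubrics:

```python
# Sketch: a rubric as descriptive levels per criterion, and a check
# that a peer review supplies a level AND a written reason for each row.
RUBRIC = {
    "learning objective": ("absent", "needs improvement", "target performance"),
    "video length":       ("absent", "needs improvement", "target performance"),
}

def review_is_complete(review):
    """review maps criterion -> (level, comment); every row needs both."""
    for criterion, levels in RUBRIC.items():
        if criterion not in review:
            return False
        level, comment = review[criterion]
        if level not in levels or not comment.strip():
            return False
    return True

good = {
    "learning objective": ("target performance", "Objective is measurable."),
    "video length":       ("needs improvement", "Runs 18 min; split into parts under 10 min."),
}
bad = {
    "learning objective": ("target performance", ""),  # a score with no reason
    "video length":       ("needs improvement", "Too long."),
}
print(review_is_complete(good))  # True
print(review_is_complete(bad))   # False
```

A platform could run a check like this before accepting a review, which is one software-side way to address the "no detailed feedback" concern.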
This request, this recommendation, is for everyone: please read the rubrics carefully, spend enough time, and give detailed reasons in the feedback box; without them it is just scores, and not very meaningful. Peer assessment improves with practice, so we are doing it more frequently. A few other concerns of yours. Many of you, or rather some of you, I would say about 15 percent, have said that there was a harsh review by the assessor, and what we have done is provide an additional level of mentor review as an offline activity for this. Some of you have said that you do not understand what the process of peer review is, or what the criteria are; again, I would suggest going back and interacting with those learning dialogues. And finally the most important one, which many of you are raising: there is no detailed feedback on individual criteria. I can only repeat what I said a couple of minutes ago: please give detailed reasons in the feedback box for each score, not as an overall comment but for each row in the rubric. What we will do now, for another 15 to 20 minutes, is address other concerns, especially regarding peer assessment, in an open question and answer. You can either raise your hand or ask in the chat window; we will address some of them now, and we can come back to these concerns in the open session this afternoon from 2 to 5 pm. We will make this presentation available; it summarizes why peer assessment, what the benefits are, how to do it well, and what you need to keep in mind. So when you do the next RCA and its peer assessment, please come back to this presentation and look at these points. Okay, let us go to CDAC Noida, Centre 1290.

I actually have a question for you; this is about the RCA2 review.
What happened was, we did the model review, the model peer assessment, using the rubrics, with the file given by IIT Bombay. For example, there is one column which says that something is exemplary if, say, a video is cut into small portions of 10 minutes or less; the exact numbers are not so important. But when we evaluated against the model peer review, what was given was not exemplary, a bit less than exemplary, and our opinions did not match the model review, even though our reading is also true to the rubric. So my question is: if the rubrics for peer review are not so clear, what would be the fate of that peer review?

So you are saying that the reason for the mismatch is that the interpretation of the rubrics was not clear. Let me just clarify: is that what you are saying?

What was given does not exemplify the criterion. What the rubric says is that a 40-minute video should be cut down to about 10 minutes using some software, and only then is it called exemplary. But the evaluation score given out by IIT Bombay was 4, even though the entire video runs for 40 minutes. This creates doubts while reviewing, and the decision making becomes a bit difficult for us. Once we start second-guessing, the peer review can go anywhere, and bias starts to come in. That is a comment; the question is how important it is to have a well-defined rubric, or metric, before the peer review happens.

This is a valid question: how important is it to have a well-defined rubric? As you say, having a clear, well-defined rubric is crucial in a peer review, because if it is misinterpreted, or can be interpreted in multiple ways, the reliability starts going down.
On the other hand, sometimes only some criteria are included in the rubric, for deliberate reasons. As an instructor you may give a full assignment and decide that only certain criteria are important, or that for this assignment you will focus only on certain criteria, so the rubric you create may have only a few of them. So while having a clear and well-defined rubric is very important, sometimes you may choose not to include all criteria in the same assignment, which means you are focusing only on some of them. If you find that some of the rubrics we have provided are not clear, then please post saying that you do not understand, or that there is a problem in the wording or the interpretation, and we will look into it. But once the rubric is provided, all assessment is based on that rubric.

Actually, that question was posted, but we never got a response.

So please post it with a subject line saying that the rubric criteria are unclear or open to multiple interpretations, and we will look into it. I have made a note of this; we will take a look at that. Thank you. Let us go to the next question, from K. J. Somaiya College of Engineering, Centre 1016.

Hello. I have faced inappropriate peer review the last two times. In assignment 1 I was evaluated 1 out of 5, and this time I got 0 out of 12. I tried consulting my mentor, but the mentor is not responding at all.

Let me address this question for all of you. From the common interaction sessions we know that a lot of you have trouble at the level of mentors, and we are working on it. You can raise it here, but I cannot give you a response beyond saying that we are working on it right now. What would be more beneficial to ask in this interaction session are broader questions that we can address: what kind of feedback should I give, or what should I do if I do not understand the criteria?
If your mentor has not responded, what we can do at this point is start a discussion thread ourselves, where you can log it: you say this is the RCA, this is my mentor, and then we will follow up. I am trying to note things down right now, but if it is there in writing it is a lot more useful, so we will start a thread on that count. These are all procedural issues, and procedural issues can be sorted out via procedures put in place; we cannot sort them out conceptually in this face-to-face interaction. The conceptual problems are more like: how do we make sure the process is reliable, or how do we make sure we understand and do not misinterpret the rubric? Conceptual issues are what are more suitable to discuss in this forum right now.

Okay ma'am, thank you, but there is one more question.

Okay, please go ahead.

Hello ma'am. Today you showed us the benefits of peer assessment, but I would also like you to highlight how a wrong or incorrect peer assessment badly affects the learner, because I think that would go a long way in making other people do peer assessment right.

Okay, good point, point taken; we will include this. A poorly done, or not seriously done, peer assessment actually causes some harm in the whole process. And each of you is playing a dual role: at times you are giving feedback, and at other times you are getting feedback, and you are playing both roles with equal importance. So if you do not do a good job, it is very likely, you know, there is a cosmic karma because of which you may not get useful feedback either.
It may so happen that for one particular assignment you gave good feedback and did not get useful feedback in return, but as there is more practice, and as individuals and the whole process improve, hopefully a large number of you will both give and get useful feedback. There is one more thing we can do: at some point, in addition to the rubrics, we can try to post, not a model answer, because these are resource creations, but some of the criteria found in good answers, or we can actually post some good answers we have seen, and you can do a self-assessment. Self-assessment is known to be just as effective, especially if you know what the criteria are and if you see exemplars. So you can also use that after you get back your peer assessment scores and do not agree with them; trust yourself in that case.

Hello ma'am, one more question.

Okay, last question from this centre, and then we will go to the next centre; go ahead, please.

In FDP201X there was a question where you had to give your response in not more than 250 words. I got a response for assessment that was 440 words; I counted those words. If I follow the rubrics, I have to give the response full marks, but is it fair to give full marks?

See, here you can take a call, and this is where the feedback box is very important. If you feel like being lenient, you can say: okay, I have followed the rubric, but this is really not fair, because you have almost doubled the number of words; please be alert next time. You can do it that way, or, if you choose to do something else, use the feedback box; this is exactly where it plays a role. If you feel like saying "I am not even going to touch this assignment because it violates a really important rule", it is your call; write this in the feedback and move ahead.
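On the 250-word-limit question: counting words by hand is error-prone, so a reviewer, or the platform itself, can flag over-length submissions automatically before applying the rubric. A minimal sketch; the 250-word limit is the one mentioned in the question, and everything else is illustrative:

```python
# Sketch: flag a submission exceeding the stated word limit, so the
# assessor can note it in the feedback box instead of counting manually.
WORD_LIMIT = 250

def word_count(text):
    # Whitespace-separated tokens, a simple approximation of "words".
    return len(text.split())

def length_note(text, limit=WORD_LIMIT):
    n = word_count(text)
    if n <= limit:
        return None  # within the limit, nothing to flag
    return f"Response is {n} words, over the {limit}-word limit by {n - limit}."

# The 440-word case from the question, simulated with a repeated token.
print(length_note("word " * 440))
```

This kind of check is one way the "solve it from the software perspective" suggestion mentioned later in the session could be realised.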
Don't misuse it, of course, but as a teacher you often think about it, take a call as to what is the right thing to do in that situation, and then explain your reasons up front.

Okay, thank you, ma'am.

All right, let's go to Centre 1160, Anna University Madurai.

While assessing RCA02, somebody has seen that it is teamwork, and in the PPT slide also they have mentioned it is teamwork; how can we evaluate such a pair?

Okay, let me take this question in a slightly more general form and consider two cases: firstly, if some assignments are given as teamwork, how do you assign scores, and who gets which score; and secondly, what to do with RCA02. For the specific case of RCA02: if it was an individual assignment, then it cannot be the same score for multiple people. Flag any such exceptions, put it in the feedback box, and then see how to go ahead, because you can give scores but you don't know whom to give the scores to. So just make one assumption here: if you want, assume it is one person, give your scores and go ahead; that is for this specific assignment. But as teachers, if you want to assess teamwork and then do peer review, you can put in place a number of rules. What people usually recommend is that in teamwork, if the entire team has created some artifact, like writing a report or doing a project together, then each team member has a place where they write what they did, but the same score, or the average score, is given to each of the team members; that's the whole purpose of it. It is very difficult to take an artifact made by three team members and try to give them different scores. If the team members know that each of them will get the same score, and there is a rule that they have to write what each of them did, they will self-assess and make sure that each of them pulls their load.
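The "same score or average score" rule for team artifacts is easy to state concretely: average each reviewer's score for the artifact, then assign that single number to every team member. A sketch with invented reviewer names and scores:

```python
# Sketch: one team artifact reviewed by several peers; every member of
# the team receives the same score, the average across reviewers.
reviewer_scores = {"reviewer_A": 8, "reviewer_B": 10, "reviewer_C": 9}
team = ["member_1", "member_2", "member_3"]

artifact_score = sum(reviewer_scores.values()) / len(reviewer_scores)
grades = {member: artifact_score for member in team}

print(grades)  # every member gets 9.0
```

Averaging across several reviewers also dampens the effect of one harsh or lenient review, which connects back to the concerns raised in the poll.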
So we haven't talked much about how to assess teamwork with peer review, but there are guidelines even on how to do that. If you're interested, we can post some general guidelines. I don't believe we'll be doing that for you in this FDP, but it is possible for you to implement it in your course projects. Okay, I'm going to have to get back to you on this; I don't know the answer offhand, because I did not create the assignment. We'll get back to you, but if I had to give one answer right away, I would say yes, it's one submission for all three people. What I'm going to do now is go to the chat, look at a few questions there, and then come back to the interaction.

Okay, there is an interesting suggestion from Centre 1016: to address the issue of peer review at their RC, they have requested all participants to give feedback on each other's work. This may not affect the scores, but we will get more reviews. This is actually a good idea; see if you can do it in your centre. The coordinator will have to do some coordination work, but it will actually improve the process, because you'll get a larger number of comments and the reliability will also go up. Here is another suggestion we will take into account: if somebody exceeds the word limit, we can try to solve this issue from the software perspective. We'll look at this next time, definitely; but if you find this problem and you say, look, this was an important rule, so I will just give this much feedback and not look at the rest of the assignment, feel free to go ahead and do that.

Should we give detailed feedback for peer assessment, or is brief feedback sufficient? This is an important question. What you need to do is give feedback for each criterion, which means for each row in the rubric you have to justify why you gave a score of 2, or why you marked it not adequate, and what the person can do to move from not adequate to exemplary.
This is the kind of feedback required. You're not giving domain-based feedback, you're not giving generic feedback; you're giving feedback on why a particular score has been assigned by you for this assignment. That is what will make the process transparent and clear to the person who is getting the feedback. There are some questions about RCA02; we will look into it, because it seems there is some broader problem, and many of you have faced it. There is a request from some of your centers: since peer assessment plays a major role in higher order thinking, the request is that peers should read the rubrics and give serious feedback. Definitely; thanks for posting this again. Okay, let's go back to the interaction; we'll take a few more. Let's go to center 1080, GRET cook at Pali. The first question is whether the peer assessment needs to be done discipline-centric or domain-centric. You have given 10 domains; if the assignments were distributed among peers within the same domain, that would better support linking to the learning objectives, and in judging whether a learning objective is specific or measurable, a specialist in that domain can do better justice. I'm going to take this as a suggestion for our team, and we'll try to implement it; I think we do have plans of implementing this in one of the OER creation RCAs. It is a valid suggestion to make cohorts discipline-wise, because sometimes you do need domain knowledge in order to give a valid score. Okay, what's your second question? You have also asked us to introduce pharmaceutical sciences as a domain. We will try; I can't promise it right now, because most of the experts we have are from the engineering and physical sciences. If we say yes and allow it, that means we'll have to give examples and make sure there are valid assignments, and we don't have the in-house expertise.
Okay, so the narrower we keep our target audience, the higher the quality at which we'll be able to provide feedback; that's why we are restricting certain domains at this point. My second question is: is there any research literature that says we can introduce peer assessment in the classroom? Oh, very much so. Let me repeat the question: can we introduce peer assessment in classrooms? There is a lot of research on it, and the results from the studies I showed today are from peer assessment implemented in engineering classrooms with first, second, third and fourth year students, and so on. It has been done successfully, and let me just make a note that we can upload some of these studies. Okay, how to do it in classrooms: one place where you can start is in project reviews. If you give group projects in the third or fourth year, then for the mid-semester reviews, not the very final review, you can have peer review as one of the processes that you implement. So let's look at where it is good to use peer review and where it is not so suitable. When you have creation of artifacts, like big programs, projects, portfolios, reports, presentations, or team projects, peer review is a very useful tool. On the other hand, consider problems where the answer is very specific, where some formulae have to be applied to get to a numerical answer.
In such cases peer review is overkill, because if the solution is discrete and unique, then you might as well do self-assessment; or, if grades and reliability are very important, have some TAs or the teachers do it. The purpose of peer review is learning on both sides: sharing ideas, getting multiple perspectives, making judgments at a higher order. So use it in assignments where some rich artifact is being created. And yes, there are a lot of instances in engineering classrooms, so we'll upload a few of them. Okay, thank you. Let's go to center 1032. Ma'am, we have concerns mostly regarding the peer assessment. What we understand is that the peers are not understanding the rubrics properly; if there is a slight deviation in how a participant has implemented the assignment, the peers are not able to judge it fairly. See, this question is coming up from all centers, that the assessors are not understanding the rubric or are not applying it carefully. If all of you are saying it, it means all of you are responsible at the same time. I'm not saying that each one of you is doing something wrong, but it is a collective problem that we have to solve collectively, especially after this session. The two or three main takeaways were in one of the slides, so let me just put that slide back. Okay, so how to do peer assessment well: the first two bullets are IIT Bombay's responsibility, which we have done to some extent and will keep focusing on, namely explaining the expectations, explaining why peer assessment is important, and providing training.
The others are a collective responsibility for each of you, especially the last one: the assessor needs to give detailed comments and feedback, needs to understand the detailed criteria and use them, and needs to explain why they gave a certain score. I'm going to keep repeating this over and over, because that seems to be the problem you have had, and it also seems to be a good solution. Please read each rubric cell carefully before and during assessing; each of you, please commit to doing this for the next two weeks. And each of you, please give detailed reasons in the feedback box for each score that you give. I think if each participant follows these two guidelines, a lot of the problems will disappear. For the mentors, the recommendation is to make sure these two points are followed; if they are not, take steps to fix them. Let's go to center 1235, Dr. Mahalingam College of Engineering. Ma'am, actually we have two questions, and one you have already answered, regarding RCA02. Regarding RCA01, unfortunately I posted my assignment but did not do the peer review. What will happen to the person whom I was allotted? Will their grade be reduced? Okay, thank you very much for your honest response, and thank you for this. Let's do one thing: can you post a note to tell your mentor this happened, and ask them to score it, or send them your scores? We don't want to punish that person because of some oversight, so if you have now done the review, please send the scores to your mentor, your mentor will get to us, and we'll fix it. Okay, let's go to center 1343, and we are joined from there by Professor D. B.
Phatak. Hello, Professor Phatak. Good morning, Professor Sahana Murthy. First of all, I'm very happy to be personally present at the Supreme Knowledge Foundation, where there are a very large number of faculty members; it's really amazing to see the enthusiasm even on a Sunday, so thanks to the people here. I'm also happy to address all the 6000 participants from this remote corner near Kolkata, and the message there is that it is because of this technology that we are able to interact like this. Coming back to the basic theme of the FDP, I once again notice that people are deeply concerned about inadequate or improper peer review marking, because of which, if I am one of the participants, I may end up getting very low marks although I might have produced top-quality work. Frankly, there is nothing that can be done about it, except that every person participating in the peer review process must take that task very seriously. Professor Sahana Murthy has already emphasized this, and if we practice the suggestions given by her and others, I think we will slowly improve. One more thing I would like to add: peer review assessment has not been part and parcel of our normal academic activities. After all, we rarely ask our students to peer review other students' work, which is what we should actually start doing in our classes. Everybody must learn how to assess others' work properly and very carefully. Now, this we have to inculcate in ourselves; it is not very easy. I believe that perfection will come only when we have practiced this approach on several occasions, and this FDP gives us one such occasion to do that. Concurrent with the notion of peer assessment is the notion of peer contribution, or peer collaboration. Yesterday I mentioned collaborating communities and their importance, and Professor Sridhar Iyer has also suggested how critical it is to form communities and develop best practices at your own place. Ideally, a few faculty members working in a specific domain could come together and decide that they will
practice, not necessarily on the entire course as I mentioned yesterday, but on a few topics, where they will try to use this pedagogy and this technology and see whether the teaching is more effective, in the sense of whether learning by students is better. Please remember: making our teaching better is important, but it is not the main ambition; making learning better is our ambition. Therefore we must get into the minds of individual students, understand how they function, and try to optimize their learning by using appropriate pedagogy. The second and more important thing I would like to repeat, which I think I already mentioned yesterday: in the last FDP we could get about 250 top performers who willingly agreed to become our associate faculty. Going forward, with 6000 participants in this FDP, I would expect at least 300 or 350 faculty members to come out as top performers and be willing to become mentors for the future courses. Please remember, the numbers are humongous. My friend Anil Sahasrabudhe, chairman of AICTE, tells me that there are more than 350,000 unique teachers in professional education, and he wants all 350,000 teachers to be well trained in the use of modern technology and pedagogy. It is impossible for that task to be done by IIT Bombay and IIT Kharagpur alone; it is impossible for it to be done with only 150 or 200 remote centers participating. We need to enlarge the number of places through which we reach out to the teachers, for which we will be adding what we call supplementary remote centers. These may not have interactive facilities, but the group attending from such a college could at least watch the live streaming of these interactions, and there would be individual mentors willing to mentor them. That is the model we propose to take forward, and for it we require many more associate faculty, who have to come from within
you. My ambition is that if by March next year we can form a group of about 1000 to 1500 such mentors, we can easily reach out to 350,000 teachers within the next three years. But we'll need support from all of you, and that support cannot come only individually. That is why I will stress again the importance of forming active groups and active communities among yourselves. Coming together, participating in this FDP, and learning something useful is important, but it is not good enough; you must continue with the same passion and the same spirit, and believe me, you will not be able to continue with maximum impact and effect if you do it alone. Going forward, I strongly suggest that before you disperse today (I suppose the teams which will be doing some team activities have already been formed), you resolve to keep these teams alive even beyond the FDP. Form new teams. Spend at least 15 minutes every week assembling in groups of five or six, with the sole agenda of discussing how to take forward this mission of adopting new technology and new pedagogy. If you do that regularly, slowly but surely your entire thinking and approach will change much faster. This is my belief, because this is exactly what happened with us. Many of us started practicing these things five years ago, thanks to the autonomy given to us by IIT, but invariably we would discuss them in groups. Of course, IIT Bombay has an advantage: an extremely strong research group, with about 30 or 40 research scholars working towards their PhD in educational technology, and that helps in group activity. But there is no reason why every participating institution cannot have a few teachers interested in pursuing research in educational technology, not necessarily with the sole objective of publishing papers, although that is a good objective, but with the additional objective of actually coming out with some discussion and some new points, practicing those points, giving feedback, and so on.
As I mentioned, we are creating a software platform for collaborating communities; we propose to release the beta version by the end of December. It will have a lot of wikis and an open-source educational resource repository using DSpace, and it will permit communities to form, to discuss, to opine, and so on. You may be able to start using that platform from January, but platform or not, the basic conviction has to come from your own mind: I must be convinced that if I do this activity not alone but as a group, everyone in the group will benefit more than from just the individual effort. That is all I wanted to say. Once again, I would like to thank all of you for showing this great enthusiasm in this completely offbeat topic of using technology with the appropriate pedagogy. Believe me, in the next 10 years there is going to be a turmoil in the entire education system of the world, and for once we have not lagged behind the developed world, because nobody in this world knows these things better than others; everybody has been experimenting with these technologies and this pedagogy only in the last 5 to 10 years. That means in India we can actually create an absolutely world-class teaching-learning environment, and you shall be regarded, of course, as pioneers in that effort. Thank you so much, God bless you. Over to you, Sahana. Okay, thank you very much for those words, Professor Phatak: indeed very inspiring, encouraging words, and also a lot of practical, useful suggestions. In fact, your words should be the concluding words for this session, but let me just summarize the whole morning, and in fact perhaps this whole FDP so far. While this session was on peer review and the previous session was on doing collaborative projects, the larger picture is about building community and sustaining community, and as Professor Phatak said, each one of you has to participate enthusiastically and seriously. What we will do from our side is try our best to facilitate that this
process happens smoothly for all of you to contribute well. As said by Professor Phatak, see you later.