Good evening. Our project was to build a peer assessment tool for the Sunbird platform. We started by studying Open edX ORA (Open Response Assessment), since it is the most widely deployed peer assessment tool today and, moreover, it is open source, so we went through ORA's documentation and its code. This is the overview of the work we will be discussing today. First, something about Open edX ORA. These are the steps involved in the LMS (learning management system) part of ORA. It starts with submitting your response. The next phase is learning how to assess a response: since the peer assessment process involves a lot of subtleties, we must first ensure that the user who will perform peer assessment understands what it involves. The next phase is assessing peers; in this phase the students assess the responses of other students enrolled in the same course. After that comes assessing your own response: having learned how to assess and having seen how marks are given, your point of view has become wider, so you can now judge your own response fairly. The next step in the LMS part of ORA is the staff grade, where the course mentor assesses the response himself or herself. Last comes the procedure for evaluating grades: ORA takes the median of the peer scores for each criterion, sums those medians, and converts the total into a grade. When the course has staff assessment enabled (it is optional, so not all courses have it), the final score is the score given by the staff. We also explored other peer assessment tools that are available and in use. One was SPARK Plus; it is not open source, but it is deployed on a wide scale.
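Before moving on to the other tools: the grade-evaluation step described a moment ago (median of the peer scores per criterion, summed, then converted to a grade) can be sketched in Java. This is our own illustration of the idea, not edX's actual code, and all names are ours:

```java
import java.util.*;

/** Sketch of ORA-style grade evaluation: median per criterion, summed. */
public class OraGrading {

    /** Median of one criterion's peer scores. */
    static double median(List<Integer> scores) {
        List<Integer> s = new ArrayList<>(scores);
        Collections.sort(s);
        int n = s.size();
        return n % 2 == 1 ? s.get(n / 2)
                          : (s.get(n / 2 - 1) + s.get(n / 2)) / 2.0;
    }

    /** Sum the per-criterion medians and convert to a percentage grade. */
    static double grade(Map<String, List<Integer>> scoresByCriterion, int maxTotal) {
        double total = 0;
        for (List<Integer> scores : scoresByCriterion.values()) {
            total += median(scores);
        }
        return 100.0 * total / maxTotal;
    }

    public static void main(String[] args) {
        Map<String, List<Integer>> scores = new LinkedHashMap<>();
        scores.put("Ideas", Arrays.asList(3, 4, 4));   // three peer scores, max 5
        scores.put("Content", Arrays.asList(2, 3, 5)); // max 5
        System.out.println(grade(scores, 10)); // medians 4 + 3 = 7 of 10 -> 70.0
    }
}
```

The medians (rather than means) make the result robust against a single unusually harsh or generous peer.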
But it was not what we required, as it is used to assess the contributions of individuals working in a group. A similar tool was WebPA; it is open source, but it does the same thing as SPARK Plus. Another very good tool was peergrade.io. It has some unique features that differ from ORA: for example, it includes a rubrics library, which edX ORA does not have. In ORA you have to write the criteria and options manually, which is a very tedious job, whereas in peergrade.io this comes ready-made. This is the comparison we drew between Open edX and the Sunbird platform. Open edX has a modular but interdependent code base: if you make a change in one XBlock, the change can affect some other part of the code as well, so it is highly coupled, whereas Sunbird is decoupled. edX uses Python, Django, MySQL and MongoDB, whereas Sunbird uses Java, Node.js and Apache Cassandra. We then started installing the Sunbird portal. For installing it, it is specifically stated that you must have Ubuntu 16.04 LTS; we tried with the 18.04 version and it does not work. You have to have Node.js version 6.14.2; newer versions are available, but Sunbird does not work on them, and the npm version also has to be exactly as specified. Another issue is that you have to get the API keys from the Sunbird team: you have to mail them and request the keys. Moreover, you have to set up the environment variables for Sunbird manually, which is not mentioned in the documentation, so that is an issue. What we have done is adapt a data structure somewhat similar to ORA's. We created about ten tables, which we think is roughly the minimum number of tables needed to build a peer assessment tool, and we uploaded the database to a cloud server from dbfree.net. The next part will be explained by my colleague Khushbu. Our first approach was developing the GUI prototype.
We developed it as a desktop GUI, and we included login and registration frames as well, though once our tool is integrated with the Sunbird platform these would not be needed, as they are required only in the developer version; we also needed a database. Now, this is our first frame. Over here the instructor sets the questions, along with how many peers must grade each learner and how many peers each learner must grade. Then over here the rubrics are set. Rubrics are a collection of criteria: the evaluation of peers is done on the basis of several criteria, each describing a characteristic the answer should demonstrate. For example, one such criterion can be the facts or ideas that support the answer; on that basis the response is judged, and for each criterion there can be any number of options. Then we have the samples, where the author sets sample answers so the learner can learn how to assess responses. Then, if the author feels he needs to edit the criteria, he can click the edit-criteria control in the previous frame and edit them; similarly, we created this frame for editing the options. Now the LMS side: over here the learner first submits a response, selecting a course and then entering the answer. Over here the learner learns to assess responses on the basis of the criteria, and the sample answers set by the author are displayed. Then this is the peer assessment frame: we have kept anonymous IDs for the peers, the peer responses are displayed, and for each of the four criteria the different options are displayed according to the data set in the rubrics. Then in self assessment the learner assesses himself or herself, in the same way as he assessed the other peers, and then over here the result is displayed.
Now, this was the GUI prototype; our second approach was developing a web application, which will be explained by Indranath. Good evening everyone. The first part of our project was to develop a prototype modelled in Java, basically because Sunbird uses Java, and we developed our prototype in such a modular way that the database handling part is completely separated out, so that it could be easily integrated into any form. Initially we used SQL, but it was designed so that you can easily switch it to Apache Cassandra, which is what Sunbird uses. Here is the overview of the web application, which we developed in only a few weeks. This is the CMS registration page, where the content manager registers; then there is a CMS login page. There might be a question of why we use these login pages: they are solely for development purposes; otherwise you can use sessions to get into that part. Here is where you set the prompts for the course and the questions, and here is the rubric-setting part, which the author fills in. Next are the LMS registration and login pages, and next is the submission of responses. The workflow goes like this: ignoring the login part and passing directly to the session, we first check whether the user has already submitted a response. If he has, we go on to the second check, where we verify that he has learned to properly assess the question; it is really important to ensure that he has truly learned how to assess a response, because otherwise his grading would have no value.
Then we move on to the peer assessment part. After the learner has learned to assess, we check the criteria set on the CMS side for the number of assessors and the number of assessments to be made, and after he has completed the peer assessment we move on to the self assessment part, where, if I have any doubt about whether a person has assessed me wrongly or rightly, I have some degree of freedom to judge my own grade. After that we go to the grading part. Now, suppose my response has been distributed to several people and some of them do not submit their assessments properly; say my answer has been distributed to five people. The basic thing we do is this: if my response has been graded at least 70%, that is, by three or four of those five people, then we run an algorithm that takes the median of the grades received and grades accordingly. The second part is auto-grading, which is of course future work and was not possible within the scope of this internship: we could use machine learning algorithms, like the ones used to generate summaries, to compare previously submitted answers against the answer being submitted, and grade it on that basis. So that is all for the workflow. These are some basic UIs; finally it displays the result as the self grade and the peer grade. One issue we faced was that we did not want to start from scratch; in this world almost everything has already been made, so we tried integrating something existing, but that was really tough, because our primary goal was to develop for Sunbird, and Sunbird uses Java.
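The 70% fallback rule just described can be sketched as follows. This is our own sketch: the names, and the exact handling at the threshold, are assumptions.

```java
import java.util.*;

/** Fallback grading: if at least 70% of the assigned peers have graded a
 *  response, take the median of the grades that did come in. */
public class PeerGradeFallback {
    static final double THRESHOLD = 0.7;

    /** Median peer grade, or empty if too few peers have responded yet. */
    static OptionalDouble finalPeerGrade(List<Integer> received, int assignedPeers) {
        if ((double) received.size() / assignedPeers < THRESHOLD) {
            return OptionalDouble.empty(); // wait, or escalate to staff grading
        }
        List<Integer> s = new ArrayList<>(received);
        Collections.sort(s);
        int n = s.size();
        double median = n % 2 == 1 ? s.get(n / 2)
                                   : (s.get(n / 2 - 1) + s.get(n / 2)) / 2.0;
        return OptionalDouble.of(median);
    }

    public static void main(String[] args) {
        // 4 of 5 assigned peers graded (80% >= 70%), so a median is produced.
        System.out.println(finalPeerGrade(Arrays.asList(6, 8, 7, 9), 5));
        // Only 2 of 5 graded: below the threshold, so no grade yet.
        System.out.println(finalPeerGrade(Arrays.asList(6, 8), 5));
    }
}
```

Returning an empty value rather than a provisional grade leaves the decision (wait longer, or fall back to staff grading) to the caller.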
So it was pretty tough to find anything; we initially went through the documentation of Open edX, but as it is written in Python we had to go through thousands of lines of code, and although we adopted many ideas from it, we had to start from scratch. Then there was implementing the database on a cloud server, to make it remote; we actually built two modes, a web page mode and a Java application that can also run offline. The basic advantage of the offline mode is that you do not have to go to a site every time you take a course, the way you do with Open edX; so we built the prototype such that if a school adopts it, it can easily be run as an offline application too. Second was installing the Sunbird portal; as mentioned, that was pretty difficult because we had to obtain the environment variables and API keys. Next was integrating the peer assessment tool with the Sunbird platform: this was the main issue we faced, and had we not faced it, we would have integrated Cassandra into our existing tool. Then there is implementing with Apache Cassandra. The major reason to go for Apache Cassandra instead of SQL is that, with the recent rise of big data, when you have millions of users, non-relational databases scale better than relational ones, and here are a few points on why one should use NoSQL methods. As for our future work: Apache Cassandra and its concepts were new to us, and we did not switch to it directly because hosting sites for it are hard to find, so our future work may include switching to an Apache Cassandra database. Next is integrating the web page we designed to implement the peer assessment tool: you can easily embed it in an iframe of a Sunbird course made in the future, so it can be easily implemented.
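The planned switch to Cassandra is made tractable by the modular database layer mentioned earlier: the workflow code talks only to a store interface, so a MySQL-backed or Cassandra-backed implementation can be dropped in. A minimal sketch of what such an interface might look like; all names are hypothetical, and the in-memory class is only a stand-in for illustration:

```java
import java.util.*;

/** Hypothetical store interface keeping the database layer swappable. */
interface SubmissionStore {
    void saveResponse(String learnerId, String courseId, String response);
    boolean hasSubmitted(String learnerId, String courseId);
    List<String> responsesToAssess(String learnerId, String courseId, int count);
}

/** In-memory stand-in implementation, for illustration only. */
class InMemoryStore implements SubmissionStore {
    private final Map<String, String> responses = new LinkedHashMap<>();

    private static String key(String learnerId, String courseId) {
        return learnerId + "/" + courseId;
    }

    @Override public void saveResponse(String learnerId, String courseId, String response) {
        responses.put(key(learnerId, courseId), response);
    }

    @Override public boolean hasSubmitted(String learnerId, String courseId) {
        return responses.containsKey(key(learnerId, courseId));
    }

    @Override public List<String> responsesToAssess(String learnerId, String courseId, int count) {
        // Hand out other learners' responses, never the requester's own.
        List<String> out = new ArrayList<>();
        for (Map.Entry<String, String> e : responses.entrySet()) {
            if (!e.getKey().equals(key(learnerId, courseId)) && out.size() < count) {
                out.add(e.getValue());
            }
        }
        return out;
    }
}

public class StoreDemo {
    public static void main(String[] args) {
        SubmissionStore store = new InMemoryStore();
        store.saveResponse("khushbu", "course-1", "answer A");
        store.saveResponse("indranath", "course-1", "answer B");
        // "khushbu" is only handed the other learner's response to assess.
        System.out.println(store.responsesToAssess("khushbu", "course-1", 5));
    }
}
```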
We can also work on refining the GUI; there are many features that Open edX ORA has that we do not, as we are not that advanced, but we can come fairly close to competing with them. Thank you. Any questions? How easy or difficult would it be to adapt any available rubric library, or at least the content of each component of such a rubric library? Sir, as mentioned, we had to work on Sunbird, and Sunbird is built on Java, whereas Open edX is built on Python and its frameworks, so we had to build everything from scratch; and for a rubric library, there are many online tools to create one, but not everything is open source, and that is the major issue. Even peergrade.io says it is freeware, but that is not necessarily open source. No, no, it is not open source. So that is a very interesting point; I am not very sure, but free software has its own limitations on usage. Yes sir. Lastly, why do you say that because one platform is built essentially on a Java framework while the other is on, say, Django and Python, there is a problem? Why should there be a problem at all? If I have a platform built around microservices, I should be able to add to those microservices and use those APIs to connect with any damn thing built on any technology. So why do you think there is a problem? Sir, there might be a problem because you would be using some libraries; Jython, I think, is the library by which you can use Python code from Java, though I am not certain about that. But why, sir, do I need to write Python code that physically integrates with the code on the Sunbird platform at all? Why can't, for example, the entire ORA segment be the equivalent of an XBlock? Just as an XBlock can be written using absolutely any technology but will still work with Open edX.
Why can't I think of repurposing the existing ORA framework exactly as it is and using it as an add-on to Sunbird, as a service? Sir, the first thing is that when using such a library, the complexity will grow. You didn't get to play with Cassandra enough, is that what you are saying? No sir. Unfortunate, because you could have learned very useful things using Cassandra. Did you at least have a peek into SQL and related things? Yes sir, we installed the system and tried to translate our queries into it, but there were some 10 to 11 tables. Good work, thank you. Thank you sir. Why are you hosting your database somewhere else? Sir, as there was no free hosting site available, that is what we found. Why not locally, on a machine? We hosted it remotely so that responses can be shared: entering answers on our own is pretty painful, so if she enters an answer, I should easily be able to access it. That is the whole point. If the tool is installed on, say, three machines, and the others have submitted answers and I have to submit mine, the system must be able to distribute my answer among those two; if the database is hosted only on my machine, they will not get my answer. That is the basic thing. Sir, if we host it on a local server, then when I am using it, how will the other person get the response I have submitted? Just had one question. Yeah, just a minute. So you use self assessment after peer assessment, right? Doesn't that defeat the purpose of having peer assessment?
No sir. Suppose I distribute my answers among peers who have an enmity with me and they grade me low, but I think my answer is good; that is where the staff assessment part comes in. If there is a huge rift between the scores, and staff assessment is enabled in the CMS part, the staff will get a notification about the rift and look into it. No, but what if, for that, there is the self-assessment... I don't think he knows what the assessment is. Okay. Right, but since it is being done immediately after peer assessment. Yeah, but peer assessment... Sir, the whole thing revolves around that. Okay, so the final grade will also take the peer assessment into account. Okay, but I just wanted to add one more comment on their effort: they have also studied other peer assessment tools, not just the one they are modelling on, which several other studies were lacking; they researched those tools as well, so they are much clearer about which tools are suited to which kind of work. That's the difference. I just wanted to share with you one very unintended use of peer assessment that we found when we ran this program as part of a faculty development program. In one module every participant was supposed to upload an assignment, and that assignment was to be peer assessed. We got a very frantic mail from one serious-minded person. He said, "I was shocked that as a peer reviewer I was required to assess one submission, and that submission turned out to be exactly my own submission with my name changed." We never thought peer assessment could lead to plagiarism detection, which was an incidental advantage. So these temptations occur not only in the minds of students but even in the minds of teachers; basically, it is a human problem, not a student problem.