In this video, we will talk about Affective Computing; there will be two parts. In this part, let us ask: what is Affective Computing? This particular slide is based on papers by Sidney D'Mello and Professor Ryan Baker, and some of the content is from the iMotions software documentation. Affective Computing is sometimes discussed alongside Emotional Intelligence, or EQ; nowadays people say you should not look at just IQ, EQ also has to improve. So what is this Affective Computing? Let us see a very brief introduction.

If you have watched the movie 2001: A Space Odyssey, there is an AI called HAL, the HAL 9000, which runs the spacecraft. In this movie, a group of scientists travel to space on a mission, and the whole spacecraft is controlled by this computer, actually an AI. Interestingly, this AI reads what people are saying through lip reading and facial analysis. It also asks people questions like "Are you confused? How are you feeling?" It is attributing emotion to the crew. Movie directors and writers have long imagined robots, but giving emotion to a robot was a far-fetched, movie-like idea. By the way, this movie was released in 1968.

So what is emotion detection, or Affective Computing? Suppose you want to develop a system or device whose purpose is to recognize the emotions of the participant, that is, the human, and interpret them. First you have to recognize that some emotion is showing on the face, then interpret what that emotion is. Then you have to analyze why it happened. Then you have to simulate a response: based on that emotion, you speak in an appropriate tone or give feedback.
So an artificial intelligence machine, or the agents in intelligent tutoring systems, can show simulated emotions on a face, like a smiling or sad expression. How do we do that in learning analytics? This is called Affective Computing. It is widely used in marketing and in management, where people try to understand someone's emotions as they work through problems and then provide feedback or counseling. Affective Computing gained huge popularity after a 1995 technical report by Professor Rosalind Picard; she also gave a wonderful TED talk, I believe sometime in the 2000s. Please check it out if you get time; it is an interesting talk.

So let us think about it: how do we express emotions? How do humans express emotions? Think about it and write down two or three ways we express emotions. Once you have written those down, ask yourself: how do you detect emotions when you are speaking with someone? Imagine you are speaking with another person and that person is expressing an emotion; how can you detect it? Whatever the modality, whatever the data you could collect to detect that emotion, think about it. Write down your answers before continuing.

You might have thought about facial expressions; speech, where the tone changes; posture and gesture, since the body actually indicates emotion a lot; and movements. Facial expression is the key modality, but the others are also important. You can also use physiological monitoring systems like EEG, EMG, or GSR. Or, if you ask students to write something on a laptop or computer, then based on the typing speed, the words they use, the mistakes they make, and the way they form sentences, we can also tell what the emotion is.
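The typing-based idea above can be sketched as a small feature extractor. This is a minimal illustration only; the `(timestamp, key)` event format and the feature names are my own assumptions, not from any specific keystroke-analysis tool:

```python
# Sketch: extracting simple typing-behaviour features of the kind
# affect-detection work uses (typing speed, pauses, correction rate).
# The event format (timestamp_s, key) is hypothetical, for illustration.

def typing_features(events):
    """events: list of (timestamp_s, key) tuples in chronological order."""
    if len(events) < 2:
        return {"keys_per_sec": 0.0, "backspace_rate": 0.0, "long_pauses": 0}
    duration = events[-1][0] - events[0][0]
    keys = [k for _, k in events]
    backspaces = keys.count("BACKSPACE")  # crude proxy for mistakes
    # Pauses longer than 2 s may signal hesitation or confusion.
    gaps = [b[0] - a[0] for a, b in zip(events, events[1:])]
    long_pauses = sum(1 for g in gaps if g > 2.0)
    return {
        "keys_per_sec": len(keys) / duration if duration > 0 else 0.0,
        "backspace_rate": backspaces / len(keys),
        "long_pauses": long_pauses,
    }

sample = [(0.0, "h"), (0.2, "e"), (0.4, "l"), (0.6, "l"),
          (3.1, "BACKSPACE"), (3.3, "o")]
feats = typing_features(sample)
print(feats)
```

Features like these would then feed a classifier alongside the other modalities, rather than being read off directly as emotions.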
Each person has their own signature in forming a sentence. Those are all channels from which you can collect data. And imagine you are in a classroom where a student is talking: based on the noises they make, the utterances, the sounds in the classroom, you can tell whether they are bored, excited, or happy; all these things can be detected from the noises they make. So you can collect data from different modalities; the modality of expressing emotion is not just the face but many channels. If you collect all this data and analyze it to predict students' emotions, that is called multimodal analysis. For example, you can combine facial expressions, gesture and posture, eye movements, and voice to detect emotions; then it is multimodal analysis. Because the modalities of expression differ, it is an interesting field, and multimodal analysis is gaining popularity.

The problem is that each sensor used to capture these modalities has a different sampling rate. You might have a webcam running at 25 frames per second, which means you capture the learner's face 25 times per second, that is, 25 pictures per second. An eye tracker might run at 90 Hz, that is, 90 samples per second. Then how do you abstract that information and combine it? That is a challenge. Some work has been done in this field and people are still working on it; it is a fairly new area and interesting to start with, but how to combine and synchronize the streams is still an open problem. I would recommend that you check the article "Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications" by Professor Calvo and Sidney D'Mello, published in 2010.
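The sampling-rate mismatch described above (a 25 fps webcam versus a 90 Hz eye tracker) is commonly handled by resampling every stream onto a shared timeline. Here is a minimal sketch of one simple strategy, carrying the most recent sample forward; the data is synthetic, not from any real sensor, and real pipelines use more careful interpolation:

```python
# Sketch: aligning two sensor streams with different sampling rates by
# resampling each onto a common clock (most-recent-sample-carried-forward).

def align(stream, timeline):
    """stream: sorted list of (timestamp_s, value). Returns one value per
    timeline tick: the latest sample at or before that tick."""
    out, i = [], 0
    for t in timeline:
        while i + 1 < len(stream) and stream[i + 1][0] <= t:
            i += 1
        out.append(stream[i][1])
    return out

# Synthetic example: webcam at 25 fps, eye tracker at 90 Hz, over one second.
webcam = [(k / 25.0, f"frame{k}") for k in range(25)]
gaze = [(k / 90.0, k * 0.01) for k in range(90)]

# Common 10 Hz timeline onto which both streams are resampled.
timeline = [k / 10.0 for k in range(10)]
frames = align(webcam, timeline)
gaze_vals = align(gaze, timeline)
print(list(zip(timeline, frames, gaze_vals))[:3])
```

After alignment, each tick of the common timeline carries one sample from every modality, so features can be fused row by row.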
It reports the state of affect detection across multiple channels, with tables for each modality, such as voice-only, text-based, or facial-expression-based detection. It is a very interesting paper, and we do not find a comparable survey after it, mainly because not much work happened on modalities other than facial expressions or human observation. Human observation, and detecting emotion from facial expressions, received far more focus than the other data channels, mainly because we could not take those other instruments out of the lab; we could not use them in real classroom settings. Recently, a lot of portable devices have appeared, which gives us hope that we can take devices like EEG and GSR sensors to the classroom and collect data. So I would recommend you check this paper; it is an interesting read.

Now let us look at a couple of these data-collection methods. One is body language and posture. There are pressure-sensitive seats: what you see here is a pad you can put on top of your chair; you sit on it, and it measures the pressure points as the student's posture changes. This is recorded and can be reviewed, so the student's posture can be detected. But the problem is that there are not enough studies to prove its performance, improve it, or make it reliable. Machine learning methods using data collected by pressure-sensitive seats have shown that we can detect emotions with 83% accuracy. But if you try to detect a particular emotion, say frustration or boredom, that is not enough. An 83% accuracy figure may not tell the whole story; you have seen how what matters is not just accuracy but the kappa score, or the precision and recall values. You can say 83% is good, but the precision and recall of these systems are not really great.
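The point that 83% accuracy can be misleading is easy to make concrete with a tiny numeric example. The confusion-matrix counts below are invented purely for illustration; they are not taken from any pressure-seat study:

```python
# Sketch: why accuracy alone misleads for a rare emotion like frustration.
# Invented numbers: of 100 observations, only 10 are truly "frustrated".
# A detector that almost never says "frustrated" can still score high accuracy.

tp, fn = 1, 9     # frustrated cases: 1 caught, 9 missed
fp, tn = 8, 82    # non-frustrated cases: 8 false alarms, 82 correct rejections

accuracy = (tp + tn) / (tp + tn + fp + fn)
precision = tp / (tp + fp)
recall = tp / (tp + fn)

print(f"accuracy={accuracy:.2f} precision={precision:.2f} recall={recall:.2f}")
# Accuracy is 0.83 here, yet the detector finds only 1 of 10 frustrated students.
```

This is why the lecture stresses reporting kappa, precision, and recall alongside accuracy when the target emotion is rare.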
The other issue is that you cannot take it out and collect data from 30 students in a real-time classroom; you can only do laboratory experiments. There are other sensors, like electromyograms (EMG), which can be portable and very small; these are promising because you can take them to classes, or to the field where three or four students are interacting in a lab and working on something; you can attach these devices and collect data. Or EEG: the device shown here is a bit old; nowadays headsets start with four channels and go up to many more channels to capture EEG data. There is a lot of research going on in this field, and it is no longer really costly; the devices keep getting cheaper. They were very expensive when portable versions first started to appear, but now it is easy to collect this data. So going forward, you might collect more data using these kinds of physiological sensors, including EDA. In fact, most smartwatches, or the Myo band, which you might have seen (it has since been discontinued), collect a lot of data: accelerometer, rotation, everything. Based on that, we can even detect students' posture and gestures automatically and use them to infer their affective states. These are some sample physiological sensors I just wanted to show.

The key part, as I mentioned, is facial expression analysis. We will look into it in detail in the next video, but let us get introduced. In facial affect analysis, we want to observe the learner's face and detect and classify the emotion: are you angry, do you have fear, are you surprised, sad, and so on.
Professor Paul Ekman actually did cross-cultural research on the island of Papua New Guinea, among indigenous people, that is, people who had not come into contact with the modern world; they still live in the old culture and tradition, so faked emotions are not there; they show real emotions. He went to Papua New Guinea, collected photographs of the people's faces, did a lot of study, and reported that we can detect emotions from facial expressions. The six basic emotions he reported are anger, disgust, fear, happiness, sadness, and surprise. This was in the 1960s, and he did a lot of study after that; the databases and the reports on how to detect the emotions are all publicly available. Paul Ekman's Facial Action Coding System has become very famous.

What Ekman suggests is that these emotions, these facial expressions, are generalizable: it is not the case that emotions shown by people from the Indian subcontinent differ fundamentally from those of people from other countries. People usually think that Asian people show fewer emotions or do not show emotion, something like that, and I am not talking only about South Asian people. Ekman says that is not true. Although it is true that if I am observing Indian participants I would be able to read them better than participants from non-Indian backgrounds, like from America or Europe, where I may not be able to detect emotions as accurately, the expression of emotion itself is largely generic; many studies have been conducted and have supported this. So do not worry that these reports are for other countries and not for India; that is not true. It is possible to use these kinds of rubrics to detect emotions in Indian students also.

Now an interesting question: why do you think we have to think about affective computing in learning analytics?
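Ekman's Facial Action Coding System describes an expression as a combination of facial "action units" (AUs), and FACS-based tools map AU combinations to the six basic emotions. As a rough illustration of that idea only: the AU prototypes below are approximations drawn from commonly cited combinations (e.g. cheek raiser AU6 plus lip-corner puller AU12 for happiness), not Ekman's authoritative coding, and real FACS work is considerably more nuanced:

```python
# Sketch: a simplified lookup from detected FACS action units (AUs) to one
# of Ekman's six basic emotions. The AU combinations are approximate
# prototypes for illustration; real FACS coding is far more detailed.

PROTOTYPES = {
    frozenset({6, 12}): "happiness",       # cheek raiser + lip-corner puller
    frozenset({1, 4, 15}): "sadness",      # inner brow raiser + brow lowerer + lip-corner depressor
    frozenset({1, 2, 5, 26}): "surprise",  # brow raisers + upper-lid raiser + jaw drop
    frozenset({4, 5, 7, 23}): "anger",     # brow lowerer + lid tightener + lip tightener
    frozenset({9, 15}): "disgust",         # nose wrinkler + lip-corner depressor
    frozenset({1, 2, 4, 5, 20, 26}): "fear",
}

def classify(active_aus):
    """Return the prototype emotion whose AUs best overlap the detected set."""
    best, best_score = "neutral", 0.0
    for aus, emotion in PROTOTYPES.items():
        score = len(aus & active_aus) / len(aus)
        if score > best_score:
            best, best_score = emotion, score
    return best

print(classify({6, 12}))     # happiness
print(classify({1, 4, 15}))  # sadness
print(classify(set()))       # neutral
```

Automatic systems replace the hand-coded AU detection assumed here with computer vision, but the AU-to-emotion mapping step works in this spirit.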
I was saying that in learning analytics we also have to collect affective data, but why do you think we have to do that? Think about it; list two or three answers. Please pause this video, and after writing them down, resume to continue.

You might have come up with many answers. The main question is: can we understand students' learning without emotion? Is it even possible? Are the students learning? Is she bored? Is she confused? When did she get confused? Without that, it is not possible. If you are a teacher, you would say no. When you are teaching in a real classroom, in the traditional face-to-face method, you observe the students' facial expressions as you teach. Based on those expressions, you might ask them what the problem is, conduct a short quiz, or go and explain in more detail; these things happen. Suppose you see that most students are bored or not focused; you might ask them to stand up and get some energy, or something like that. So that is important: without emotions, we cannot understand what learning is happening. Affective computing is therefore very, very important for learning analytics.

Emotions are related to moods and other parameters like motivation, attitude, and interest, but they are not actually equal. Emotion is not the same as attitude or interest, but it can impact motivation and attitude or interest towards the subject. So understanding emotion is important for keeping students motivated and keeping their interest high, so that they spend more time on the learning content.

Next, the interaction between affective states and actions in computer-based learning environments. To study affective states in a computer-based learning environment, we cannot use only the six basic emotions, as we saw. There have been a couple of papers on this; we will see them in the next slide.
They try to understand the interaction between affective states: boredom, confusion, frustration. What is this interaction when students are working with a computer-based learning environment? Here we cannot use the basic emotions alone. We have to use learner-centric emotions: boredom, frustration, confusion, and also disgust, surprise, and delight; those kinds of emotions have to be considered. These are called the learner-centric emotions. In this particular paper, the method of reporting is student self-report, which we will discuss in detail. So in this slide, we saw what affective computing is, its basics, what the learner-centric emotions are, and why it is important to consider affective computing in learning analytics. Thank you.