Hello everyone. I hope you all had a great day yesterday. You got to meet a lot of UX designers, so that was a great opportunity.

The first thing I'm going to discuss is decoding human emotions. When we talk about human emotions, the first thing that comes to mind is how you interact with people. Yesterday you interacted with a lot of UX designers, and in that process the first thing you did was talk with them and read their emotional state. Based on that emotional state, you decided how to continue the interaction: if they seemed uninterested, you changed the topic. And it's continuous. You keep reassessing the other person's emotional state for as long as the interaction lasts. The question I want to explore is how AI can do that same kind of reading.

To begin with, what is an emotion? An emotion is a conscious mental reaction to external events: whatever events you face, your mental reaction to them is your emotion.

Why do emotions matter to us as UX designers? Because emotions drive user behavior and engagement, and they shape feedback. Once you understand the user's emotions, you get much richer feedback about what is happening with your product. Emotions also shape decision making. If you are happy, your decisions tend to be impulsive; you will make more purchases when shopping online. If you are in a high-stress situation, you won't make those purchases, because stress lowers your trust in the platform. That's how human emotions work.

So how does AI decode emotions? One established approach is the Facial Action Coding System, or FACS. Think of it like information architecture, where you categorize information by theme. FACS does something similar: it categorizes emotions based on facial muscle movements. Each muscle movement is assigned an action unit with an action description, and combinations of action units are connected to specific emotions. So for happiness or sadness, specific facial muscles are assigned, and those are what the system considers when it analyzes facial features.

Take the image on the left, the Mona Lisa. Two action units are at work: the lip corner puller and the cheek raiser. You can see the cheeks being raised here. Those action units and their descriptions capture the underlying facial muscle movements. And happiness itself comes in degrees: getting a raise at work and winning a million-dollar lottery sit at different points on the happiness scale.
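To make the idea concrete, here is a minimal sketch of how a FACS-style lookup might work. The AU-to-emotion table below is a simplified illustration, not the full FACS specification, and the function name is hypothetical; real FACS coding uses many more action units and combination rules.

```python
# Illustrative FACS-style lookup: map detected action units (AUs) to a
# basic emotion label. Simplified for illustration; real FACS coding
# involves many more AUs and trained combination rules.

# Each emotion is defined by the set of action units that must be active.
EMOTION_SIGNATURES = {
    "happiness": {6, 12},     # AU 6 cheek raiser + AU 12 lip corner puller
    "sadness":   {1, 4, 15},  # inner brow raiser, brow lowerer, lip corner depressor
    "anger":     {4, 5, 7, 23},
}

def classify_emotion(active_aus: set[int]) -> str:
    """Return the first emotion whose full AU signature is present."""
    for emotion, signature in EMOTION_SIGNATURES.items():
        if signature <= active_aus:  # all required AUs were detected
            return emotion
    return "unknown"

# Example: the Mona Lisa discussion above (cheek raiser + lip corner puller).
print(classify_emotion({6, 12}))  # -> "happiness"
```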
On top of the emotion label, there is something called an intensity score. The intensity score defines how happy you are, not just whether you are happy, and these are the factors on which the facial coding system works. As for which emotions this kind of AI can read from human beings: sadness, anger, happiness, the generic human emotions. It can capture more than that, but these are the most common ones it understands.

Now, the issues with existing facial emotion recognition technologies. First, they have no understanding of context. The image you see is Ronaldo scoring a goal. After scoring, his expression, run through a facial coding system, reads as anger, because it matches action units (AU 7 and AU 14 in this analysis) with the jaw dropped and the eyebrows raised, so the system labels him an angry person. But it doesn't consider the real-world context of that person. Context matters, and so does culture: emotional expression varies across cultures, so the same expression can mean different things. Second, people tend to fake their emotions. The emotion you see on someone's face may not be the emotion they are actually feeling, and in that situation facial emotion recognition is of no use.

So how do we overcome that? What will the future be? The future is understanding pupil dilation. Pupil dilation is connected to the autonomic nervous system, the part of the body that controls involuntary functions. You can't fake it; it is directly hardwired into your system. The pupil responds whenever you face some change in external events, and that response can be connected to specific emotions. In this image you see a dilated pupil, which correlates with heightened arousal; that arousal can come from excitement as much as from stress. Pupil dilation would be an add-on signal, analyzed alongside the facial muscles, for understanding a person's emotional state.

Compare the two images: the left pupil is dilated and the right one is constricted. In the constricted state, the person is calm. On the left, the person's field of vision has expanded; they are in a more stressful, aroused state and more aware of what is happening in their field of view. There are some issues here too, which I will come back to. There is an explanation by a senior German scientist that pupil dilation can betray human emotions even before you consciously know what your emotions are: it reveals your emotional state first. That is why pupil dilation will be useful in the future of facial emotion understanding.
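To show how pupil dilation could feed into such a system, here is a hedged sketch: it estimates arousal from the change in pupil diameter relative to a resting baseline and cross-checks it against the facial label. The baseline value and thresholds are illustrative assumptions, not validated numbers, and a real system would calibrate per person and control for lighting, since pupils also react to brightness.

```python
# Illustrative sketch: use pupil diameter as an involuntary arousal signal
# alongside a facial-expression label. Baseline and thresholds are made-up
# values for illustration only.

def arousal_from_pupil(diameter_mm: float, baseline_mm: float = 3.5) -> str:
    """Classify arousal from pupil diameter relative to a resting baseline."""
    change = (diameter_mm - baseline_mm) / baseline_mm
    if change > 0.15:    # noticeably dilated -> aroused (stress or excitement)
        return "aroused"
    if change < -0.15:   # noticeably constricted -> calm
        return "calm"
    return "neutral"

def combine(facial_label: str, pupil_state: str) -> str:
    """Cross-check the facial label against the involuntary pupil signal."""
    if facial_label == "happiness" and pupil_state == "calm":
        # A smile with no physiological arousal may be a polite, faked smile.
        return "possibly masked emotion"
    return facial_label

print(combine("happiness", arousal_from_pupil(4.2)))  # dilated -> "happiness"
print(combine("happiness", arousal_from_pupil(2.9)))  # constricted -> "possibly masked emotion"
```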
What would using these technologies mean for us as UX designers? How would they enhance the user experience? Suppose you are using Spotify and you are not in a good mood. Normally, you would go looking for calm or serene music yourself. What these facial emotion technologies would do is understand your emotions right away, just from the camera. With infrared cameras, the system could analyze your emotional state and recommend specific music to match it. If you are using the Coursera app, it could understand whether you are engaged or losing interest in a course. The same applies to mental health: if you are taking medication, your emotional state over time could be reported in detail to your doctor. And the same applies to Meta, which could deliver content based on your emotional state. A sketch of that recommendation loop follows below.
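As a closing illustration, here is a minimal sketch of the Spotify-style loop just described. Everything here is hypothetical: `detect` stands in for whatever camera-based recognizer is used, and the mood mapping is an invented example, not a real Spotify API.

```python
# Hypothetical sketch of an emotion-aware recommendation loop. None of
# these names are real Spotify APIs; the detected emotion would come from
# a camera-based recognizer like the FACS + pupil pipeline sketched above.

MOOD_PLAYLISTS = {
    "sadness":   "calm & serene",  # soothe a low mood
    "happiness": "upbeat hits",    # match the energy
    "anger":     "slow acoustic",  # de-escalate
    "unknown":   "daily mix",      # safe default when detection fails
}

def recommend(detected_emotion: str) -> str:
    """Pick a playlist mood for the detected emotional state."""
    return MOOD_PLAYLISTS.get(detected_emotion, MOOD_PLAYLISTS["unknown"])

# e.g. the camera pipeline reports the user looks sad:
print(recommend("sadness"))  # -> "calm & serene"
```

So that's it from my side.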