Any questions about this? All right, I would like to go on to the next topic, which is interfaces for virtual reality. I want to give you a high-level overview this time, and then in the next lectures we can finish up. So we are up to the highest level. Oops, I have made a mistake: I want to say something about the combination of senses first, so let me back up a bit. I have finished talking about audio, and I have finished talking about the visual sense. Let us think now about multiple senses and how all of these come together in virtual reality. I want to cover that before getting to higher-level concerns such as interfaces.

Combination of senses. What have we looked at so far? We have the visual sense; we have the auditory sense, hearing; and we have the vestibular sense. We have not talked about rendering directly to the vestibular organ, and there are not displays for that, but it nevertheless is impacted by virtual reality, so it becomes important. We did not spend time on haptics, or touch, but that is also very important for virtual reality; I could easily spend a couple of lectures on haptic feedback, as I did for the auditory case. And one that gets much less attention, but we might as well include for completeness, is smell and taste, which provide information through chemoreceptors. Haptics could include sensing of temperature as well, among other possibilities: so, mostly thermal sensing and sensing of forces along our skin in various places.

So we have all of these senses, and within each sense we have cues. In the case of vision we talked about binocular and monocular cues; in the auditory case we had binaural and monaural cues. In the case of haptics or touch, cues could come from many different parts of our body. And remember that we get perception through the combination of cues.
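One standard way to model this combination of cues, often applied to multisensory perception, is maximum-likelihood cue integration: each cue gives a noisy estimate of the same quantity, and the combined percept weights each cue by its reliability (inverse variance). Here is a minimal sketch; the specific cue values and variances are made-up numbers for illustration, not data from any experiment.

```python
def combine_cues(estimates, variances):
    """Inverse-variance weighted average of independent noisy cues.

    Returns the combined estimate and its variance; the combined
    variance is always smaller than any single cue's variance,
    which is why adding senses sharpens perception.
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    combined = sum(w * e for w, e in zip(weights, estimates)) / total
    return combined, 1.0 / total

# Hypothetical example: visual and auditory estimates of a sound
# source's direction (degrees). Vision is usually less noisy, so
# the combined estimate lands closer to the visual cue.
visual, auditory = 10.0, 20.0        # illustrative direction estimates
var_visual, var_auditory = 1.0, 4.0  # illustrative noise variances

direction, var = combine_cues([visual, auditory], [var_visual, var_auditory])
print(direction, var)  # prints 12.0 0.8 -- pulled toward the visual cue
```

The same weighting idea gives one possible answer to the "election" question later in this lecture: a very reliable cue largely outvotes a noisy one.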
So our brain is making some kind of interpretation through a combination of cues. One important thing to pay attention to is that the cues can come from multiple senses and be combined to provide a coherent picture. If you combine vision with audio, you have a very powerful set of cues, and that greatly enhances the virtual reality experience; it is very important to make use of that. I could further combine these with haptics or touch. Suppose I provide force feedback: perhaps my virtual reality system is being used to train doctors to perform surgical operations, and they want to feel the resistance of the materials as they use instruments. They would have touch feedback combined with vision, giving them additional feedback that makes the experience more real and corresponds more closely to what would happen in the physical world.

There are interesting combinations here. For example, if I take audition and vision and put them together, there is one very well known effect that helps explain how cues from each combine to give us perception: the McGurk effect. The experiment goes as follows. I show you a video of someone speaking words, or in this case particular syllables, and I present audio at the same time. In the real world these are consistent: if I make a "ba" sound, you hear "ba" and you see my lips coming together to say "ba". If I say "ga", you do not see my lips come together. But I can make it so that the auditory and visual parts are in conflict. It turns out, as shown by McGurk and MacDonald, and by colleagues who have done similar experiments, that if what you see and what you hear are mismatched, your brain will construct something different from both. One of their simplest experiments had "ba" as the spoken part.
So you hear "ba", you see "ga", and then you swear you have heard "da", some other sound altogether. You can even tell people about this effect in advance: you say, "you are about to see the McGurk effect; this is what you are going to experience," and they still cannot prevent their brain from hearing a "da" sound. It will still seem to you that you have heard "da", because of the mismatch between the visual and auditory parts. You can go watch videos on this; it is a fascinating effect. Sometimes with an optical illusion I can tell you about the illusion and you go, "OK, I do not see it anymore; I see what is going on." Sometimes not, and the McGurk effect happens at such a low level that your perception of what sound has been made by the speaker depends on the combination of visual and auditory input: you will swear they have said something they have not said, just based on manipulating the visual part. People who have difficulty hearing of course read lips to enhance the information their hearing provides, but we are all doing this all the time, even with no difficulty in our hearing. It gives us the overall picture of what has been said: you think it is purely your ears, but it is actually vision together with your ears that gives you the perception of what has been said. That is fascinating.

If we go off in another direction, virtual reality generates a conflict between these senses every time we grab a controller and move the avatar forward while, in the real world, we are just sitting in place. That visual cue causes us to perceive forward acceleration, while the vestibular organ knows that we are not accelerating, and that ends up being a fundamental source of simulator sickness. So it is very important to consider how these things come together. Up here I described a mismatch, the McGurk effect, which merely causes you to perceive a different sound.
Here I have a mismatch that might cause someone to vomit. So it is very interesting to study these conflicts and understand their effects. Now I could try to put all three of these senses together, as I have said. When I move my character forward I may generate corresponding sounds: maybe I will hear some wind, and I will change it as I accelerate. So I will change the sound hitting my ears and present the right visual stimulation, but the vestibular sense will still be in disagreement. What happens in that case? Is it like an election, where these two very strong senses outvote the third and your brain finally just accepts it? Or does it continue to cause discomfort due to the conflict? Experiments need to be done on these things. I have not touched on smell and taste at all, but we can bring those into the combination of senses as well, and again ask what happens when the cues are consistent or inconsistent. When they are consistent in VR, you get a more powerful experience. When they are inconsistent, you may get some level of disbelief, or some level of fatigue, because there is discord between the senses; maybe there is just a temporal shift due to latencies, which again might cause some fatigue or disbelief. And ultimately, in the worst cases, there is sickness, as in the very well known case of vision and the vestibular sense being mismatched, which relates to what we called vection. All right, questions about that? That is the extra part I wanted to add.
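The visual-vestibular mismatch above can be made concrete with a crude sketch: compare the forward acceleration implied by the rendered camera motion against the acceleration a seated user's vestibular organs actually report (approximately zero), and accumulate the mismatch over time as a rough exposure score. All names and numbers here are illustrative assumptions, not part of any real VR SDK or validated sickness model.

```python
def conflict_magnitude(visual_accel, vestibular_accel):
    """Absolute mismatch between the two self-motion cues (m/s^2)."""
    return abs(visual_accel - vestibular_accel)

def exposure_score(visual_accels, dt):
    """Sum of conflict over a session for a seated (non-moving) user.

    A crude proxy only: sustained, large conflict is the condition
    associated with simulator sickness in the lecture's discussion.
    """
    return sum(conflict_magnitude(a, 0.0) * dt for a in visual_accels)

# A seated user pushes the thumbstick: the virtual camera
# accelerates for a second, then coasts (zero acceleration).
frames = [2.5, 2.5, 2.5, 0.0, 0.0]  # m/s^2 per frame, illustrative
dt = 1.0 / 5.0                      # illustrative frame period (s)

print(exposure_score(frames, dt))   # prints 1.5
```

Note that coasting at constant virtual velocity contributes nothing here, which matches the vestibular organ sensing accelerations rather than velocities; teleport-style locomotion similarly sidesteps sustained visual acceleration.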