So, again, this is part of the bird's eye view; I want to talk about hardware. One thing to talk about is displays. We have talked about screens, or I should say visual displays. So, examples of visual displays: we could have screens, maybe going back to cathode ray tube monitors. I think you are old enough to remember those at least. Then LCDs, plasmas, organic LEDs, and so forth. And you might want to involve lenses in some kind of optical system if you are making a head-mounted display out of these. You can get a little more exotic and do a virtual retinal display. There is a lot of work on this going on in industry, much of it behind closed doors: a lot of research, a lot of prototypes being developed. Virtual retinal displays essentially take a tiny projector and project images directly onto your retina, though of course the light still passes through the lens of the eye, so it is not quite as direct as you might think. But at least you are not simply looking at some pixels and dealing with whatever the relationship is between those pixels and how they fall onto the retina; you are directly considering what information should be presented to the retina. So, that is one possibility. There is also an interesting and exotic family of displays called light field displays. You can research these on your own; I will not cover them. In this case, rather than presenting the information as a two-dimensional projection, they try to capture the entire field of light within a three-dimensional region. This is very helpful because when you rotate your head and your eyes translate through space, or you move your head around inside of, say, a cubic volume, you can adjust the information that is actually presented to your retina according to your position and orientation in that space.
So, you capture the entire light field using something like, say, a plenoptic camera array or some other kind of device, which is also very difficult to do, and then present the entire information in some way so that you can, in real time, take into account these small motions and present the entire light field. So, that is another possibility. And somewhere in here I should have mentioned projectors; I will put them up here by screens. In fact, I probably should have put them first. We have had movie projectors for a long time, going all the way back to the beginning of film, like the train pulling into the station from 1895. So, maybe projectors should even be first there. That is visual; now think about displays for the other senses. For audio: speakers, or earphones, which are small speakers. You can also fool your audio sense by bone conduction; there are some bone conduction audio techniques. For touch, there are what are called haptic devices, and I will not have time to cover these in the class, but a couple of general categories worth considering are cutaneous versus kinesthetic feedback. That is a nice way to divide these up. An example of the first one is getting some kind of vibration as feedback; a rumble pack in a game controller would be an example of that. Or maybe my motions actually get blocked and I feel a resistive force; there may be a robot on the other end pushing back at me, providing the right force so that it feels like I am touching a wall, and I keep getting stopped every time I put my hand out, but maybe it is a robot hand that is touching mine. So, that would be an example of kinesthetic feedback.
So, you can provide this kind of feedback as well. As far as smell or taste or vestibular stimulation goes, I will not include too many examples, but there certainly is ongoing research on those as well: providing the right chemical feedback to trick your sense of smell or taste, and electrical stimulation of the vestibular organ is possible, but I do not recommend trying it; it is very painful from what I hear. All right, other hardware you might think about: of course, we have computers in use, and we have graphics processing units. Just to list everything off: CPUs and GPUs go in there. I should also say that there are inputs, in this case maybe something like a game controller or a keyboard, and we talked about data gloves, which have been around for a long time. By the way, there is a lot of interesting research and development going on with input devices for virtual reality, and a lot of open work to be done there, because once you put on a virtual reality headset it is like a blindfold, and probably the last thing you want to be doing is feeling around for the keyboard. So, how can you do things like the following: suppose you want to sit and write software inside of virtual reality; you just want that to become your desktop. How do you do that? How do you enter data fast enough and in a comfortable way and interact with your environment? Is it going to be a keyboard and mouse, or something else? One more bit I want to say about hardware is what we use for tracking. Tracking is going to become very important: as I talked about in the case of a head-mounted display, when you turn your head, you have to get the visual information to update correctly, so that it really looks like the virtual world, or this alternate world, is responding appropriately to fool your brain.
So, tracking becomes very important, and the more portable the device gets, or the closer it gets to the actual sense organ, the more kinds of tracking you need. If I am in a cave system, I can see my entire body as I walk around. If I put on a virtual reality headset and look down, I do not see my body. So, I have to invent a body. How do I do that? Well, I can track my body and then bring that into virtual reality, but that is a lot of work; it is very challenging. It is reasonable as a class project, but it is very hard to make it 100 percent reliable and accurate. So, these things do not work very well; they look great, they make great demos, but it is very, very hard to do well. You could put on a motion capture suit, a full body suit with markers, which does a lot better, but if you are going to do that much work, why not just build a cave system? So, it is a very interesting kind of problem. So, I want to talk about tracking, and this will finish off the hardware. Where is my chalk? OK, let us see, I am leaving it there. All right: tracking. You can call this tracking, or estimation, or filtering; people in electrical engineering will tend to use words more like the latter. One of the most useful components is the inertial measurement unit, which people call an IMU. IMUs have been around for a long time; they were originally designed for navigation, and are especially useful in aircraft and spacecraft. If you have flown on airplanes before, almost certainly there was an IMU on board, usually a rather large mechanical device. The purpose is to provide orientation estimation. So, these are used for orientation (rotation, if you like) estimation; that is their main design. What an IMU measures is 3D angular velocity and 3D linear acceleration.
So, the part that measures angular velocity is usually called a gyroscope. Sometimes people refer to the entire inertial measurement unit as a gyroscope; I think that is not quite right. The other part is appropriately called an accelerometer. It measures linear acceleration, but it is also measuring acceleration due to gravity, because there is no natural way to separate true linear acceleration with respect to the fixed Earth from gravity; you are always measuring the vector sum of the two. It is as if we are all on a rocket ship right now, accelerating upward at 9.81 meters per second squared, which is why we are all stuck to the floor. So, that acceleration is always being measured as well. The interesting thing that has happened in this technology is that our smartphones have IMUs inside, mainly so that you can play apps where there is a ball rolling around, or I guess so that the phone can tell whether the screen needs to be reoriented, which frequently seems to fail. It very often fails because you are moving while reorienting, and the linear acceleration of the device is getting confused with gravity; there are all kinds of issues like that. But these started appearing in smartphones, they became very cheap, mass produced on the order of hundreds of millions, and now you can just take those, put them inside of virtual reality headsets, and you have an IMU that does not cost very much money at all. It can be mass produced; you can also put them inside of earphones. So, this is great: you can stick them on robots, you can put them all over the place to estimate orientation.
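To make the orientation-estimation idea concrete, here is a minimal sketch in Python (my own illustration, not any headset's actual firmware) that dead-reckons orientation by integrating gyroscope readings as small quaternion rotations. In practice, gyro bias makes the estimate drift over time, which is one reason the accelerometer and magnetometer readings are brought in as corrections.

```python
import math

def quat_multiply(q, r):
    # Hamilton product of quaternions in (w, x, y, z) order.
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def integrate_gyro(q, omega, dt):
    # One dead-reckoning step: rotate the current orientation q by the
    # angle |omega|*dt about the axis omega (angular velocity, rad/s,
    # in the sensor's body frame).
    wx, wy, wz = omega
    mag = math.sqrt(wx*wx + wy*wy + wz*wz)
    if mag < 1e-12:
        return q
    half = mag * dt / 2.0
    s = math.sin(half) / mag
    dq = (math.cos(half), wx * s, wy * s, wz * s)
    return quat_multiply(q, dq)
```

For example, spinning at pi/2 rad/s about the z axis for one second, in 100 small steps, accumulates to a 90-degree rotation about z, regardless of the step size.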
So, that is one of the greatest enabling technologies. One thing that is very closely related to the inertial measurement unit, and is sometimes considered part of it, is the magnetometer, which measures the 3D magnetic field. You can sort of think of it as a compass, but it is not exactly a compass: it is going to measure the vector sum of the Earth's magnetic field and whatever other magnetic fields are around, in the building, say, or on the circuit board that contains the magnetometer. So, it is just generally measuring magnetic fields, and that is also used to provide orientation information. As for other hardware, cameras have become quite useful. You will notice that the Oculus DK2 used in the lab has a camera; that is being used as part of the tracking system as well. When you use cameras, there are generally two different ways of doing tracking; the names are inside-out and outside-in. Honestly, I found these confusing; it took me at least a few weeks to keep them straight, but these are the terms that are used. In the inside-out case (this is a top-down view), you are wearing a headset with a camera on it, and there are some markers in the physical world; I will draw the features or markers as pluses. The camera sees these markers, and based on how they appear to the camera that is fixed to the headset, you can figure out where you are. These markers could be engineered, or you could use natural features from the environment. If you engineer them yourself, you get very accurate, very reliable performance. If you extract them automatically, it makes a fun project and an interesting demo, maybe, but it is very hard to get very high accuracy and reliability that way. Nevertheless, either way, this is inside-out.
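Here is a small sketch of how those two extra sensors help with orientation, under the idealized assumption that the device is momentarily not accelerating and far from magnetic disturbances (the axis and sign conventions below are my own assumptions; real libraries differ):

```python
import math

def tilt_from_accel(ax, ay, az):
    # When the device is not accelerating, the accelerometer reads only
    # gravity, so the measured vector determines roll and pitch (but not
    # yaw, since rotating about the vertical leaves gravity unchanged).
    # Convention assumed here: z points up when the device is level.
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    return roll, pitch

def heading_from_mag(mx, my):
    # Heading about the vertical axis, assuming the device is held level
    # and the measured field is purely the Earth's. Sign conventions
    # vary between libraries; this is one common choice.
    return math.atan2(-my, mx)
```

This is exactly the complementary role mentioned above: the accelerometer pins down tilt, the magnetometer pins down heading, and together they can correct the drift of gyroscope integration.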
The other way is outside-in, which is what happens in the Oculus Rift DK2 case. Here you have the headset on (drawn with the head looking straight down), and there is a camera out here looking at it. So, the features, which I will draw in another color, are on the headset itself. In the case of the Oculus Rift DK2 that you will use in the lab, these are infrared LEDs embedded inside of the headset. The plastic is infrared-transparent, but in the visible spectrum it looks black, so you cannot see the LEDs, but they are in there. So, in this case the headset carries the markers. This is outside-in; in other words, the camera is outside looking inward, whereas in the previous case the camera is on the headset looking outward. So, these are ways to use cameras, and we can talk about the technology of doing this tracking later in the course; I will spend some time on that. And of course, you can add depth information to the camera. You can have depth information, like that provided by the Kinect using current technology, to give you what are called RGBD sensors, providing both color and distance information. Those are going to be very important in the coming few years: as the cost goes down, the reliability goes up, and the accuracy improves, there are going to be more and more technologies that use depth cameras for various kinds of tracking. Beyond all this, there are also methods that use electromagnetic or magnetic fields generated by a base station, and you can do tracking that way. One thing to think about throughout these tracking technologies is: what do you need to track? Track your head, track your eyes, track your entire body, track an entire room full of people? There are all kinds of things you think you want to do, and you do not necessarily have to do them.
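Both camera-based schemes above boil down to inverting a projection model: you know where the markers are (on the walls, or on the headset), you observe where they land in the image, and you solve for the pose. Here is a sketch of just the forward pinhole model (the coordinate convention and focal length are illustrative assumptions, and the LED layout is made up); real systems invert this with a perspective-n-point solver, for example OpenCV's solvePnP.

```python
def project_point(p, cam_pos, f):
    # Ideal pinhole camera at cam_pos looking down the +z axis
    # (assumed convention), focal length f in pixel units.
    x = p[0] - cam_pos[0]
    y = p[1] - cam_pos[1]
    z = p[2] - cam_pos[2]
    return (f * x / z, f * y / z)

# Hypothetical layout of headset LEDs, in meters, one meter from the camera:
leds = [(-0.05, 0.0, 1.0), (0.05, 0.0, 1.0), (0.0, 0.05, 1.0)]
pixels = [project_point(p, (0.0, 0.0, 0.0), 500.0) for p in leds]
```

Tracking then asks the inverse question: given the observed pixel coordinates of several identified markers and their known layout, find the position and orientation that best explain them, typically by minimizing the reprojection error.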
For example, I am giving a lecture today, and I can only look at one of you at a time. If we were all in virtual reality doing this lecture, I could be looking at all of you at the same time: each one of you would see me looking directly at you. So, my gaze does not need to be reproduced exactly. I do need my head tracked accurately for my own display to work correctly, but in terms of what you see, I do not need to have where my eyes are looking tracked perfectly; I could just make it so that I am always looking at you, so that you are always paying attention to me and individually get the feeling of being looked at, even though I cannot possibly be doing that for everyone. So, there are cases where you do not want to provide the exact information that is going on in the real world. You really have to think about these things carefully, and there is always a cost involved, along with accuracy issues and, again, comfort issues. If you are tracking the body and it looks like my arm keeps breaking and bending in a backward direction, that is not comfortable; it is torture of some kind. So, if your system is not working reliably, it is probably not worth doing, or you have to assess whether it is really necessary. If it is, then throw all the resources you have into it and make it work; but even though it might look cool, it might not be necessary. All right, that finishes the hardware part. I want to talk about the bird's eye view of the software. Are there any questions about hardware? Again, this is a very high-level overview; I am going to go into more details of these things. In fact, I am covering the fundamentals, so by giving these examples you will be able to see more of why I am jumping into various fundamentals.