Mikaela mentioned a number of types of AR solutions and categories. One other way of categorizing them is by the type of display you're using, because it very much affects your experience. What you saw in that presentation was mostly video-based: we use a camera to capture the environment, the computer superimposes graphics on that, and then presents you with the composited video image. I'm going to show you one example that is more optical, where it all coexists; you'll understand when I get to that. My passion for AR and user interfaces comes from about a decade of work in different research labs, where I focused on new interaction technologies and interaction techniques. It starts with developing new display systems and display components, various forms of sensing, mobile devices, and physical objects in the surroundings, and then using that new technology to build new forms of interaction techniques and user experiences. So today I'm going to talk about one example with a specific use case, and I want to start with a question. What if you could go to the store right now and buy a thin, completely transparent piece of glass that you could place really anywhere you choose, in your workplace or at home or really wherever? And this piece of glass would be able to interpret what's behind it. So you're looking through this glass, and it would actually understand what's happening on the other side and be able to overlay and superimpose information in real stereoscopic 3D that would help you understand, or explain to you, things that you cannot see or could not understand with just your naked eye. Think about that for a second. The important thing, when I start out with this question, is that it very much focuses on the user experience. It's not about the technology; it's not my flashy phone or my flashy tablet.
It's really: what do I want to solve, what do I want to achieve or experience? And it doesn't matter whether that's a handheld device or, like in this case, some installed projectors. So we took this vision to a maybe slightly less sexy use case. This is a big lathe in the Department of Production Engineering at KTH in Stockholm, a two-ton machine that's very complex. Not everyone has one of these at home, and probably won't for a long time. A lathe, also known as "svarv" in Swedish, consists of some basic components. You have a tool that can move in 3D inside a workspace, and a workpiece that spins very fast; the tool hits the workpiece, cuts into the material, and is able to give it different shapes. I think some people might have made little lamp stands in high school; that's kind of the typical thing. For an industrial lathe, there's always a control computer that you can load with a program that controls how fast things are moving and where they're going, basically a set of instructions to carve out this material. And this control computer not only steers what's happening but also reads out a lot of information: rotation speed, pressure, temperature, a lot of information that is essential for adjusting what's going on. And the third component here is the human, the operator. The operator is very interested in what's going on inside this machinery and wants to get all this information from the process to the left, but at the same time wants to see how things are moving. For instance, if you're cutting too fast with too much pressure, the tool might break. So this is a classical AR dilemma, sometimes referred to as "schizo-vision": you have to look at two places at the same time, which is very hard, at least with normal vision.
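To make the kind of process readout described above concrete, here is a minimal Python sketch. The `LatheState` fields and the force threshold are hypothetical illustrations, not the actual interface of the KTH lathe's control computer; a real controller would expose many more parameters.

```python
from dataclasses import dataclass


@dataclass
class LatheState:
    """Hypothetical snapshot of data read from the lathe's control computer."""
    rpm: float              # spindle rotation speed
    feed_mm_per_rev: float  # how far the tool advances per revolution
    force_n: float          # cutting force measured at the tool tip (newtons)
    temp_c: float           # temperature near the cutting zone


def check_tool_safety(state: LatheState, max_force_n: float = 800.0) -> str:
    """Flag the condition from the talk: cutting too fast with too much
    pressure can break the tool. The 800 N limit is an arbitrary example."""
    if state.force_n > max_force_n:
        return "WARNING: cutting force too high, tool may break"
    return "OK"
```

A display system could poll a snapshot like this on every frame and render the warning directly into the operator's view of the workspace.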
So what if we could take this process information and really fuse it with the workspace, so that this operator would just have an augmented experience without that conflicting scenario? Our ASTOR system, developed with KTH, is based on a piece of technology that allows us to do just this. It's based on a holographic optical element created by Jonny Gustafsson, who was my colleague back then and who spent many, many years of research on the optics of this. This holographic optical element is very interesting because it has a number of properties that really allow it to be used for very, very diverse use cases. Essentially, it's a sheet of glass with a very thin film on top that has been optically programmed to bend and shape the light that hits it. We can program this sheet (it's a one-time process) to reflect light such that we can create the illusion of 3D inside it. So the first thing we can do down here is separate how the light from multiple projectors that illuminate this sheet comes back towards you as a user. It allows us to use two projectors shining at the same place, where the left eye will only see the light from one projector and the right eye will only see the light from the other, so you can create the illusion of depth, with stereoscopic vision, from two images. Another thing that's critical for AR is transparency. This element has been made wavelength dependent, so it reflects red light very brightly, as you see here, but the rest of the screen is completely optically transparent, just like you're used to in the real world. And at the same time, it also provides full resolution. You might have seen some of these autostereoscopic displays that are on sale right now.
With those, the more views you have, the more pixels you give up; even with one person, you actually have half the resolution for each eye, because the eyes compete for the same pixels. In our case it's taken care of optically, so you have full resolution for each eye. And the last property is that it can also be controlled in terms of viewing angles. Unlike a mirror, or any other material where the light comes out at the same angle as the incident light, this can be controlled such that you have a pretty steep projection angle but a very comfortable viewing angle straight on. So we took this optical component and mounted it, integrated with the safety glass of this lathe. Then we built a system that interfaces with the lathe, reads out all the real-time information, feeds that to the projectors that illuminate this HOE, and creates a completely fused experience. I'm going to show you a video of that in action. What you're seeing here, all the red graphics, is from our display. What you just saw is the tool moving in 3D, and here you have the actual workpiece that we're cutting into. What we see overlaid are the three components, and the resultant, of the forces that are actually measured in real time on the tip of the tool. So as we're cutting, we get direct feedback on what's going on. And here we have the feed that controls how much the tool pushes into the workpiece: the more pressure we push it in with, the bigger the forces are, and vice versa. There's, of course, other information that we can superimpose in 2D, like the running program, the various parameters, and so on. One thing that is also not evident from the video, of course, is that it's really stereoscopic. You get a real depth experience, where the forces really follow the tip of the tool in 3D space.
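The force overlay just described can be sketched in a few lines: combine the three measured components into a resultant and anchor an arrow at the tool tip. This is a minimal illustration, not the actual ASTOR rendering code; the newtons-to-metres `scale` factor is an assumed parameter.

```python
import math


def resultant_force(fx: float, fy: float, fz: float):
    """Combine the three force components measured at the tool tip
    into a resultant vector and its magnitude."""
    magnitude = math.sqrt(fx * fx + fy * fy + fz * fz)
    return (fx, fy, fz), magnitude


def overlay_arrow(tool_tip_xyz, force_vec, scale=0.001):
    """Place the force arrow so it starts at the tool tip and points
    along the resultant; `scale` maps newtons to scene metres."""
    x, y, z = tool_tip_xyz
    fx, fy, fz = force_vec
    return (x, y, z), (x + fx * scale, y + fy * scale, z + fz * scale)
```

Recomputing the arrow every frame from the live sensor readings is what gives the direct feedback the video shows, with the arrow following the tool tip in 3D.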
And this is actually interesting, because if you imagine that you're cutting inside or behind something, you still have that stereoscopic cue with the graphics, so you see where that is; it's a way of getting extra vision. Another aspect of this system, and it comes back to the properties of how it's designed, is that it really was made to not interfere with or alter the way the operator works with his machinery today. As you see, it sits on top of the safety glass and is fully transparent. If you do not want this experience, you just step out of the viewing zone and you have, again, your direct, clear view. Or you turn off the projectors and, again, nothing interferes. I think that's one of the key things we sometimes forget with AR: we're sometimes just altering our experience, getting augmented reality through a small mobile viewport, limited by the resolution and capabilities of the camera, when we have a very rich environment around us. But not everyone has the possibility to install something like this. Other things that we are exploring, and have been exploring, in this concept of the ASTOR display are ways to interact with the content. We've integrated touch overlays so you can also interact through the glass, and cameras to sense gestures (this system predated the Microsoft Kinect, using technology from a company that Microsoft actually bought), and eye tracking to get motion parallax as you're moving in front of the display; for this we're using a very nice eye tracker from the Swedish company Tobii. And we use these devices to explore interaction: in the top right, you see 3D scans of sculptures that were found on the Vasa battleship and are being scanned for preservation. What if you could still visualize and interact with them without having to expose the real artifacts, or if you wanted to show them in multiple museums at the same time? And in the bottom picture we have a surface reconstruction from a medical CT scan.
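Eye-tracked motion parallax usually comes down to rendering with an off-axis (asymmetric) frustum that follows the viewer's eye. Here is a minimal sketch of that idea, not the actual ASTOR or Tobii code; coordinates are relative to the screen centre, with the eye at z > 0 in front of the screen.

```python
def off_axis_frustum(eye, screen_half_w, screen_half_h, near):
    """Asymmetric frustum bounds at the near plane for a viewer whose
    tracked eye is at `eye` = (x, y, z) relative to the screen centre.
    Moving the eye shifts the frustum, which produces motion parallax."""
    ex, ey, ez = eye
    left   = (-screen_half_w - ex) * near / ez
    right  = ( screen_half_w - ex) * near / ez
    bottom = (-screen_half_h - ey) * near / ez
    top    = ( screen_half_h - ey) * near / ez
    return left, right, bottom, top
```

These four values are exactly what an API like OpenGL's `glFrustum` expects; feeding it the tracked eye position each frame makes the virtual content appear fixed in space behind the glass as the viewer moves.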
So how could we interact with that without having to touch anything in a sterile medical environment? This is one example of a hybrid user interface, where we combine a number of things to achieve a transparent or invisible interface as a user experience. The first thing is bringing in novel display technology that allows us to think a little differently about what we can do and what we can experience. The other is to push the sensing out into the environment. I don't necessarily want to wear something on my body; that's very intrusive right now. It's kind of odd to have this little thing hanging in front of me. Really, in 2011, can't we have a system that can listen to me, track me, and amplify me to the room and to the internet without this little boom mic? It's kind of weird. So what if we could put all the sensing in the environment, and then take these sensors, these displays, and the interaction to create this combined transparent experience? Actually, a lot of this work has been ongoing in the research community for about 25 years. In a lot of the pioneering work in augmented reality, ubiquitous computing, tangible user interfaces, and spatial augmented reality, people have already explored this. But I think that now is the time when we actually have off-the-shelf components, things that we can pretty easily integrate and build on. And we also have infrastructure and connectivity that was really not available back then, so it's very exciting from that point of view. In our work, we're continuing to explore these hybrid user interfaces. Some use cases are to enhance interaction with existing displays. Like here: we have a mobile phone that's tracked on a rear-projected display that has limited capabilities. The mobile device actually has a better screen, with higher resolution in terms of pixel density, so we can use it as a movable viewport to amplify our experience there.
At MIT, we just launched a project where we're tracking a mobile device through a novel sensing capability. We've been looking a lot at sensing and augmenting objects on tabletops. And, of course, at collaboration: how could we really amplify the collaboration and communication between users across distances, with different devices and so on? For instance, in tele-radiology, having surgeons give second opinions from hospitals that are distributed all over. So, coming back to this question: what would you do with this transparent piece of glass that could understand, interpret, and visualize things for you? I'm sure you have lots of ideas. This was just one example that I showed you, with the industrial machine, and I would really love to hear from you over the next two days if you have ideas or questions or comments around this that you would like to share. I want to thank a lot of collaborators and, of course, funding agencies. You can find a lot of my videos at olwal.com. Thank you so much.