Most of the audience here is from a design background, or is there anyone from a development background, or any other background as well? All right, cool. So I'm mostly from a development background, so I won't say design is my niche, and I won't be able to guide you on how to go about the UX for your application in that sense; my colleague Shyam will talk more on that. But from my experience, I've had my fair share of work on 3D applications and their designs. Previously I worked in a gaming studio, where it's all about how you can make the application more user-friendly and give the best, most delightful experience to your users. All right, so the topic for today is device-agnostic UX/UI for AR and VR. Before we get started with the slides and presentation, I just want a show of hands from the people who have worked on some sort of 3D application development, or done design for 3D applications or AR or VR. Quite a few. As expected, and it's very common wherever I go, I see very few hands raised from designers who have worked in this 3D domain, on how to give the best experience to your users when they are interacting in a 3D world space, which is very different from normal 2D application design. All right. So before we start with that, a little bit about the company, Imaginate, that my colleague Shyam and I work at. We are a five-year-old company based in the US and India. We built the first augmented reality simulator for the Indian Army, among a few other things, and a few recognitions we have: our product led to selection in MIT's TR35 group, we received another recognition in 2017, and recently we were selected for the K-Startup challenge in South Korea, plus a few other things as well.
And we have Hemant and Sashira D as our advisors and investors. So, getting started with the presentation. Before we get into the development side of it, we should have a very clear vision of what AR, VR and MR actually mean and what the difference between them is. These terms are very commonly used these days if you follow the media and the latest news; everyone is trying to get some sort of AR, MR or VR solution for their company. Some companies in real estate, for example, want a VR simulation or an application where the user can experience the entire development virtually before it is actually built; that is where they go for VR. AR is more common in the manufacturing industries and in maintenance, those things. So are we clear with these definitions, or should I put more light on this? So yeah, that's a good thing. Now, VR, AR and MR: people most often use the terms MR and AR interchangeably, but they are contrastingly different from each other, and I'll explain in what sense. Virtual reality is something which actually breaks your connection from your current, actual real world. I'm standing in this world space and this is my real world, but let's say, with the help of some medium, it could be contact lenses or a headset that I wear, it takes me to a different virtual world where I could have a waterfall falling from here, a few animals and butterflies flying around, a pleasant environment; or, for my working space, rather than my usual boring office desk I could be working from the Alps, or anywhere, and have a quite chilled environment.
So that is what it does: it breaks through my present reality and takes me to a virtual world which is absolutely different and has nothing to do with my real world. That is what virtual reality is, and the mediums we have are the headsets that we most commonly see; some development is also being done on contact lenses that display virtual information right onto your eyes to give you that illusion, but that doesn't actually come under VR, it is categorized under AR. For AR, if you look at the headsets that pilots wear, there is real-time information displayed on their helmets; you can call that AR. AR is essentially just another layer of information between you and your real world. If you have seen Vuforia or ARKit applications, what they do is use your phone to visualize some digital content in your space, and you just see it; it doesn't interact with the real world, but you are able to see an extra layer of information. These AR applications become relevant in cases like one very good use case with the Google Glass: whenever I'm travelling, for navigation I just need to know which direction to go, so I could have a pointer that tells me which direction to go; or on the windshield of my car I could have some essential navigation information displayed on top of the windshield glass. That's an example of AR. But the interesting thing comes when we move to mixed reality. Mixed reality is the combination of VR and AR. In VR, you have complete control of your virtual world, right?
But in mixed reality, what if we give you the same control over your real world as well? With mixed reality I could have a character stand right up here, and if I push it around, it will fall from this table onto the ground. How cool would that be, right? A few examples of mixed reality: let's say I'm a maintenance engineer, and I have issues with some mechanism or some parts that I'm not able to fix; I'm not a beginner, but very new, I just came out of my apprenticeship and started working, so I cannot do it on my own. What MR could do is this: the device would scan the environment surrounding me and identify which engine that is. Once we have that information, we could overlay a model of that engine on top of the actual engine, and I could actually see the SOPs for it. Let's say there is some repair that needs to be done, and at the moment I'm not very thorough with all of the standard operating procedures for fixing that issue: I could have the model overlaid on top of the actual engine, and it could show me animations of how to fix the thing, which is a really very convenient way to go about it. So that's where mixed reality comes in. Here is a good illustration of VR, AR and MR. You see the alien character inside the virtual world, which has nothing to do with your actual world. In AR, the alien bot is overlaid on top of your actual surroundings: the sofa that you have has nothing to do with the alien character that you're seeing on top of your screen. But inside MR, you get some concept of spatial awareness, so the model has some depth information itself.
So it knows whether it is behind the sofa, or on top of the sofa, or wherever it is in that virtual space, with the information of your actual world as well. So, coming to device-agnostic solutions. Imaginate has actually taken on a very interesting challenge, because right now people are struggling to find a proper UX design even for one particular device: if you go from one application to another on the same device, you'll find a different interface between those two applications. But what Imaginate has done is build a platform which you can access using any of these different variants of devices, and somehow have a coherent UX for that platform, right? So if you see, on the leftmost side, you have the mobile headsets that are very commonly used, the Cardboard, Gear VR and the like. These are just for visualization purposes, so you are restricted in terms of the modes of interaction with the application compared to what you'll find with the hardcore VR glasses. There you have the HTC Vive, Oculus Rift and PlayStation VR. In the HTC Vive and Oculus Rift you have hand controllers, and you have a set of trackers that track your position in the real world and carry that information into the virtual world as well. And on the rightmost side, you've got AR glasses, such as the Microsoft HoloLens and, the third one, the Meta headset, a competitor to the HoloLens. So you have this entire gamut of devices. How do we go about it, right? That's the big problem we actually have. And believe me, so far we have had monitors as the display for all of our digital information; in the coming five to ten years, that screen will be replaced by headsets.
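To make that concrete, one common way to get a coherent UX across such a gamut of devices is to translate each device's native inputs into a shared set of app-level intents, so the application logic never sees device-specific events. This is only an illustrative sketch, not Imaginate's actual implementation; all class, event, and intent names here are hypothetical:

```python
from abc import ABC, abstractmethod
from typing import Optional


class InputProvider(ABC):
    """Maps a device's native input events onto shared app-level intents."""

    @abstractmethod
    def poll(self, native_event: str) -> Optional[str]:
        """Translate a raw device event into an intent, or None if unmapped."""


class ControllerInput(InputProvider):
    """HTC Vive / Oculus Rift style: hand controllers with buttons."""
    MAPPING = {"trigger_down": "select", "grip_down": "grab", "touchpad_up": "teleport"}

    def poll(self, native_event):
        return self.MAPPING.get(native_event)


class GazeInput(InputProvider):
    """Cardboard / Gear VR style: gaze cursor plus (at most) a screen tap."""
    MAPPING = {"gaze_dwell": "select", "screen_tap": "select"}

    def poll(self, native_event):
        return self.MAPPING.get(native_event)


class AirTapInput(InputProvider):
    """HoloLens style: hand gestures recognized by the headset."""
    MAPPING = {"air_tap": "select", "tap_and_hold": "grab"}

    def poll(self, native_event):
        return self.MAPPING.get(native_event)
```

With this layering, the shared application code only ever reacts to "select", "grab", or "teleport", regardless of which headset produced the event.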
All right, so people sometimes consider this just a techie term, or just an entertainment thing, something you can wear to visualize what you usually can't see, but it actually has more real-life implementations than entertainment ones. I mean, it has both, but there are applications everywhere: real-time experiences, data visualization, manufacturing, training, all of those things. So now I guess we are very clear on AR and VR and the difference between the two. Here is a small interactive session: you have to guess which of these two images is AR and which is VR. All right, can I get some hands for "the first one is VR and the right one is AR"? All right. So the rest of you think this one is VR and that one is AR. Most of you are incorrect: that one is VR and this one is AR. It's from the very popular VR arcade, The Void. They have good arcade centers across the globe, in New York and in London, where you can put on the headset and actually experience the virtual world along with a physical stage they have set up, so you don't feel totally disconnected from your real world: when you're shooting, a person is really there, and if there is a building or a cliff in the scene, you'll actually feel some physical obstruction matching it. So that's an interesting thing. And on the right-hand side, you see an example of AR, where a small kid is holding an elephant on top of his palm. This is a very popular image from the very secretive company Magic Leap.
They have been the most secretive startup so far; they have raised about $1.5 billion without actually having any product in the market, in their seed rounds too. So I guess most of you now have clarity on it. Next, one more thing: there you see Shyam holding a tiger, and this is a good application for designers actually. How many of you think that the left one is AR? All right, cool. This one is Tilt Brush. All right, so a little about the product that we are building, NewSpace. With NewSpace, we are building an immersive and interactive 3D meeting solution for remote collaboration and design, with implementations in manufacturing, logistics, and even air traffic control systems. Just imagine the air traffic control system: right now, all of the information the controllers view comes as data represented in 2D tables, right, attributes and tuples. But to make complete sense of it at first glance, we need to transform that as well. What we can do is have a virtual environment on a tabletop where you can actually do real-time monitoring of your air traffic. How convenient would that be, right? Same for maintenance and logistics: if you can get the data from your actual machinery, then once you have the digital data, you have this immense power to present it in whatever mode of representation you want, in AR and VR. So here is a small interactive video of our product. We recorded it long ago, so we have made a lot of changes since.
[Video] "We are going to show you NewSpace, a real-time conferencing solution using the best of both virtual reality and augmented reality. NewSpace is like a regular conference call, except that it is fully immersive and interactive." I'll just skip around because we are short on time. "Here is my colleague Shiva, who is wearing a HoloLens, an augmented reality device; he will be moving to a remote location. Here is Pranayi, who is wearing an HTC Vive, a tethered VR device. And at another location is Prasad, who will be wearing a mobile VR device. They are going to demonstrate our platform, NewSpace, by entering a virtual room filled with medical content, which you will see now." "Hey, this is Pranayi. Anybody here?" "Yeah, I'm here. This is Tugha, joining from Gwalior." "Wow! Nice avatar, Tugha." "Thanks, man." "Hey guys, I'm Hemant. Sorry, there has been a change of plans; Shiva is not here. But how are you guys doing?" "Yeah, okay, fine." "Okay, thanks, man." "So, what are you going to show us today? I see a big screen here and a skeleton in the center of the screen." "Yeah, so the skeleton up there is an indicator of how we normally learn; it's all about the skeleton." "Oh, did you change that?" "Yeah, I just changed it."
"Right, here I'm going to use that as a reference to tell you more about the skeleton right here." "So I can actually walk around it?" "Yeah, you can walk all the way around it, just be careful, you know, don't get too close. Well, going ahead, I can also tell you about each and every bone. The one that I'm seeing right there on the large screen is a clavicle, which is right here. I can get very close to it. Unfortunately, in the real world at the moment there's a wall right in front of me, but in the virtual world I can see the skeleton right here." All right, so as you have seen, three people on different devices connected with each other in a virtual world and collaborated there over the skeleton. This is a very small demo that you have seen, but it's actually interactive: on the HoloLens, since it's an MR device, the skeleton and the place that you see, you can actually see them as holograms right beside you, in front of you, and you can talk with those people. The recording was done from a VR device, so you see a completely virtual world, a classroom, with some setup over there where you can upload your documents and have them shared with multiple people. So this is also very useful for the collaborative meetings that we usually conduct. Right now, for those meetings, you have to travel across the nation or overseas just to present your report to multiple people, or the other solution people take is WebEx or Google Hangouts, where they do it on a 2D screen and usually just do video talking.
There's actually very little collaboration there, more of a hindrance when you actually want to collaborate, so mostly you still have to travel. But with the help of NewSpace, you can actually bring your content, bring the 3D models you want to discuss, interact with them, actually play the animations, see how a thing is going to work, discuss which component has what functionality, all on the device, using any of the devices we have seen. This is actually a fantastic product to be building and working on. So, coming to the UX part of it. The usual presentation of future displays in movies, if you look at Minority Report or the Iron Man movies, shows a lot of screens and holograms popping up in front of you while you interact with all of them at once. But for a better design, we would not prefer to throw this entire list of data at the user in the first view; he isn't able to properly interact with any particular piece of information. The better way to go about it is to include some sort of spatial information as well. Let's say we are collaborating on a scooty or some component: we can attach spatially-based information, or abstract the information by where it belongs. If you look at most applications these days, even when they are in VR, they still stick to a 2D screen to put all of the information on display. You'll have some sort of panel on top of your hand or in the world, and you'll scroll through the list of objects and do the particular action. But the key thing all of them miss is that they have this entire virtual space to take advantage of. All right. So this is one good UX.
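The idea of spatially anchored information instead of one flat list can be sketched as simple progressive disclosure: only reveal the labels of components the user is actually looking at, rather than dumping the whole list into the first view. A minimal illustration, with made-up component names, the viewer assumed at the origin, and an arbitrary angular threshold:

```python
import math


def visible_labels(gaze_dir, components, max_angle_deg=15.0):
    """Return names of components within max_angle_deg of the gaze ray,
    nearest-angle first, so the UI can reveal detail progressively.

    gaze_dir: (x, y, z) direction the user is looking in.
    components: {name: (x, y, z) position relative to the viewer}.
    """
    def angle_to(pos):
        # Angle between the gaze ray and the direction to the component.
        dot = sum(g * p for g, p in zip(gaze_dir, pos))
        norm = (math.sqrt(sum(p * p for p in pos))
                * math.sqrt(sum(g * g for g in gaze_dir)))
        return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

    hits = [(angle_to(pos), name) for name, pos in components.items()
            if angle_to(pos) <= max_angle_deg]
    return [name for _, name in sorted(hits)]
```

For example, with the gaze pointing down +z, a component straight ahead would be labeled while one off to the side would stay hidden until the user turns toward it.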
So now, to give more perspective on the UX, I'll request Shyam to give his views on it. Sure. Thanks, Shubham. So, the multi-device user experience was really challenging when we started, at the very beginning of NewSpace. We encountered so many challenges and so many learnings, and a few of them are these. One of them is enabling object interaction among users. I have a slide later that gives you an idea of how we interact with an object in 3D space. It's really challenging, because in the real world, you and I are here: if we want to move something, we have hands, so we easily know how. But there are many devices, as my friend has shown you, and all these devices have their own input controls. If you look at the Oculus Rift or HTC Vive, you will see they have controllers through which you can easily grab objects. But when it comes to a mobile device, it really has no input: all the input is the screen itself. Once you put on the headset, a Cardboard or another device such as the Samsung Gear VR, or the other cardboard devices you see in the market, and we have our own cardboard viewer which is very portable and foldable, then you have the problem of interaction: all you have is the gaze cursor. I think everybody is aware of the gaze cursor. In VR space, the way we trigger or interact with objects is the gaze: a small dot through which we trigger, or click, the buttons. So it is really challenging to do so many actions with one single gaze cursor. Of course, there are a couple of approaches for triggering objects. In UI design, we have three approaches: one is the cylindrical UI approach.
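Since a single gaze cursor is often the only input on mobile VR, selection there is typically done with a dwell timer: the cursor rests on a target long enough and the target triggers. A rough sketch of that logic, in frame-loop style with hypothetical names, not the NewSpace code:

```python
class GazeDwellSelector:
    """Fires a 'select' on a target after the gaze cursor has rested
    on it continuously for `dwell_time` seconds."""

    def __init__(self, dwell_time=1.5):
        self.dwell_time = dwell_time
        self.current = None   # target currently under the gaze cursor
        self.elapsed = 0.0    # how long the gaze has rested on it

    def update(self, hovered, dt):
        """Call once per frame with the hovered target (or None) and the
        frame's delta time; returns the target id when selection fires."""
        if hovered != self.current:
            self.current = hovered   # gaze moved: restart the timer
            self.elapsed = 0.0
            return None
        if hovered is None:
            return None
        self.elapsed += dt
        if self.elapsed >= self.dwell_time:
            self.elapsed = 0.0       # reset so it doesn't re-fire every frame
            self.current = None
            return hovered
        return None
```

In a real application the growing `elapsed / dwell_time` ratio would also drive a visual fill animation on the reticle, so the user knows a selection is about to happen.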
The other one is the spherical approach, and the third is the flat one, and we have implemented all three. But along with that, one major challenge we faced in NewSpace was that we wanted to perform multiple actions on one object, and this was a hurdle for us. One of our developers, Daval, came up with a thought: we can use one single option, which is called pinning. Anyway, I'll just show you that slide and come back. Yeah, you see there, underneath there is a panel; this panel is the one the user interacts with in the mobile space. It is very difficult to do certain actions there: say you want to play some object, or move an object, or change an object. So what he came up with is putting a pin on the panel, and then it is not aligned with the reticle anymore: you can gaze anywhere, and once you're done, you can unpin it and move it again. Another challenge was moving around the space, just like how I walk here; that is also done with the gaze cursor, so if you trigger on the top, you are able to move around. Okay, I'll take you back. So that is one of the challenges that we really faced. Yeah. So quickly, I'll show you this slide: these are some of the user experience approaches for VR. You can still apply the general user experience principles to all the applications or services you provide, along with thinking beforehand about the following interaction challenges. And motion sickness is one of the really challenging things: people get really sick if the FPS drops or they are not fully immersed in the space. Anyway, we are running out of time, so I'll end here. You can show some screenshots.
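The pinning idea described above can be sketched as a panel that follows the gaze reticle until the user pins it, after which it holds its world position and the gaze is free for other actions. A minimal illustration with hypothetical names, not the actual NewSpace implementation:

```python
class PinnablePanel:
    """A UI panel that follows the gaze reticle until pinned, then stays
    put in world space so the gaze cursor is free for other actions
    (moving, playing, or swapping objects)."""

    def __init__(self):
        self.pinned = False
        self.position = (0.0, 0.0, 0.0)

    def on_frame(self, reticle_position):
        """Call each frame with the reticle's world position; returns
        where the panel should be drawn this frame."""
        if not self.pinned:
            self.position = reticle_position  # follow the reticle
        return self.position                  # pinned: hold world position

    def toggle_pin(self):
        """Triggered by the pin button; flips between following and fixed."""
        self.pinned = not self.pinned
```

Unpinning simply resumes the follow behavior on the next frame, which matches the "once you're done, you can move it again" flow described above.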
So we can show you some screenshots from our applications. Okay. This is our space; this is one of the rooms, which we call the New Room. In the middle there is an object, a Bosch dishwasher. These three users are interacting with it: one of the users, probably the Oculus user, is in the seat, and on the dishwasher there is a UI in magenta color which this user is controlling while the other two users look on. Through it they can open the dishwasher, put the plates in, and move around; they can see the entire object just how you would see it in the real world. This is another space where we placed a Ford car; we have a POC with them, and we showed them this demo so they can review the car in our space. And this is another example, a design review for an automobile, where you can see the engine working, how it works. Okay, anyway, I'll end here.