Alright, so I'm here at ODG, and I'm going to try on the ODG R9 for the first time. And I see a 4K video right here in front of me. In front of everything, this is a 4K video. Oh, that's cool. So, hello, who are you? Oh, let me turn it on. Hello, I'm Patrick Johnson, Director of Business Development at ODG. So today, we are showcasing two products. The first one is the R9. This is our prosumer-facing product. It has a 50-degree field of view, so it's the equivalent of sitting in the best seat in the house of a movie theater. It has a front-facing six-DOF camera that allows for six degrees of freedom, so it's able to do markerless motion tracking. It has 1080p to each eye, so it's high-definition resolution, and it's running the Snapdragon 835 processor. We are the only company that currently has the 835 processor inside of our headset, and we're the first mobile device to get access to it. This is perhaps the fastest mobile processor in the whole world. That's right. And it's right there, and it's real. And you're using it? And we're the first ones to get it. And it works. You just had it on your head. Do you have it all hooked up yet? Yes, that's right. These glasses, our R9s, are going to be coming out in Q2 2017, and then our next generation, the consumer glasses, which I can show you next, those will be coming out in the second half. Is that the one, the consumer one? Is that the one they're checking out here? There's a lot of people queuing up to try one. So these glasses, the R9s, are what is currently being showcased at this table. So they're all checking those out right now? What's in the demo that they are seeing? So the demo walks through a couple of different use cases. One of them is a game where you're in a race car. It's using that spherical camera on the table, and it's actually first person; you're inside the car, and you're able to see all the people around the table in real time while you're in a virtual car.
It walks you through an educational experience with a virtual volcano, and a couple of different AR experiences as well. So how do you enable AR? You need to be able to measure distance to start? That's right. Well, there are a couple of different ways of doing it. When you have a six-DOF fisheye camera like the one we have here, basically you don't need a marker. So what it does is it maps the world, and then you can anchor virtual images to the real world where you'd like them to... But is there any infrared or anything to know the distance of stuff? You don't necessarily need to know the distance as long as you have two cameras. So there's an RGB camera right here, and then the six-DOF camera as well. Nice. There are two cameras. Actually, one other thing that we have on right here, this is a MIPI port, and what you could do is actually plug different sensors into this port. So if you wanted to put your infrared sensor on it, you would simply clip it in, and now you have infrared with your... And now you have Project Tango on your face. That's right. Exactly. Potentially. Potentially, absolutely. And is it running Android? Yes, this is actually running Android 7, so the latest and greatest from Android. And you have a different version? So yes. The next version is the R8. These are our consumer-facing glasses. One point I'd mention is the last pair, the R9, is around 6.5 ounces. The R8 is only about four and a half ounces, so it's very lightweight. Is it also the 835? And it's also running the 835. But what's less in there? Because it's lighter. What did you remove? So some of the things that are different: we don't have the same sensors in here. It's a little bit smaller display, so the field of view is 40 degrees versus the 50-degree field of view. Let's see here. So I'll show you the same video playing on these, so this is 720p. So I'm going to try the R8 for the first time right here.
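The two-camera point above works by triangulation: a point seen at slightly different horizontal positions in the two images (the disparity) can be converted to a distance once the focal length and the spacing between the cameras are known, with no infrared needed. A minimal sketch; the numbers are illustrative, not ODG's actual camera parameters:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Triangulate distance from stereo disparity.

    focal_px     -- camera focal length in pixels (illustrative value)
    baseline_m   -- spacing between the two cameras in metres (illustrative)
    disparity_px -- horizontal shift of the same point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("point is at infinity or was not matched")
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 700 px focal length, 6 cm baseline.
# A 21-pixel disparity places the point at 2 metres.
print(depth_from_disparity(700, 0.06, 21.0))  # 2.0
```

The same relation explains why a marker isn't required: any feature the two cameras both see yields its own depth estimate, which is what lets the glasses map the room and anchor virtual objects to it.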
So you can see the light goes through. Yeah, it feels like nothing. It feels like sunglasses. Yeah, are you still filming? Yeah. I think so. Yeah, there we go. All right. Yeah. Yeah, it's really clear and nice. I'd like to be able to see the Twitter accounts of everybody; can you do stuff like that? So if you had all the data and all the people's faces mapped in a database, the front-facing camera could do facial recognition. You'd just need to sync to security cameras that are in the room with facial recognition. And scan their badges when they come in? I suppose so, yeah. We're really not focused on that element of it, but it's totally feasible. There will be parties where everybody has to wear one. Of course, yeah. If you don't wear the glasses, then you're not part of the party. Networking: you know who is who and who you need to talk to? Yeah, or if you're on a dating application, you get the arrow that pops up and you say, hey, I'm available. All right. So what is ODG? So ODG, we've been around since 1999. The first six generations of glasses were exclusive to the government. So soldiers on the battlefield are actually using our glasses. In 2015, we opened it up to enterprise. And now, as I mentioned, 44% of all Fortune 500 companies have either purchased our glasses or have budgeted to purchase them in 2017. Currently, we have been focusing on enterprise. In 2016, 21st Century Fox came in as our lead investor. And now we're taking two parallel paths: one that will continue to focus on enterprise with our R7, and then our next family of glasses, the R8 and the R9, will be exclusively for consumers. Consumers and prosumers. Consumers and prosumers. And super enthusiastic geeks. That's right, and the early adopters. So is it possible to imagine applications where you enter a store and they put glasses on you and you can get an augmented experience in the store and they tell you, don't get that milk, take the soy milk?
It's healthier for you. Oh, absolutely. We actually showcased one of those applications at AWE in 2016. And the way it worked is you could actually tie in to some of your other nutritional apps, just like you mentioned. You pick it up and you say, hey, these nachos, is this something that I should be eating today? It pops up all the nutrition, and it lets you know, well, if you do eat the nachos, then you're not going to be able to eat anything the rest of the day. Don't take them, get out of the store, and go into Whole Foods. Exactly. But one other thing to that is you can also do indoor navigation. So if you had your grocery list and you uploaded it into the glasses, now you can do indoor tracking, and you'd be able to optimize your route for what products you needed to pick and where to pick them from. I'm so looking forward to this world where the whole world is mapped, and there's a whole bunch of people with Project Tango and stuff, and it all gets indoor mapped. You'd store everything, and you wouldn't have to wear it all the time; it'd be nice if you could easily fold them into your pocket and wear them sometimes. Yeah, exactly. And that's really what we're focused on now: most of our customers don't wear the glasses from the moment they wake up to when they go to bed. They wear the glasses when they have a job to do, and for the consumer case, when they have an application they want to run or a movie they want to watch or a game they want to play. How long is the battery? So currently with the R7, our currently available glasses, it's really use-case dependent. So if you're doing something that takes a lot of computer vision, such as augmented reality or mapping your environment, you're going to get about 90 minutes of battery life. If you wanted to watch a movie or just view a PDF, you get about four hours. Now, the R8 and the R9 are brand new products to the market.
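The grocery-list route optimization described above is, at its simplest, an ordering problem: given your position on the store floor plan and the shelf positions of the items, visit each item along a short walk. A minimal sketch using a greedy nearest-neighbour heuristic; the store layout and coordinates are hypothetical, not from ODG:

```python
import math

def plan_route(start, items):
    """Order shopping-list items into a short walking route.

    start -- (x, y) entrance position on a hypothetical store floor plan
    items -- dict mapping product name to its (x, y) shelf position
    Repeatedly walks to the nearest remaining item (greedy heuristic:
    short in practice, but not guaranteed optimal).
    """
    route, pos, remaining = [], start, dict(items)
    while remaining:
        name = min(remaining, key=lambda n: math.dist(pos, remaining[n]))
        route.append(name)
        pos = remaining.pop(name)
    return route

# Hypothetical shelf coordinates, in metres from the entrance.
shelves = {"milk": (10, 2), "soy milk": (10, 3), "nachos": (2, 8)}
print(plan_route((0, 0), shelves))
```

With real indoor tracking the `start` position would come from the glasses' six-DOF pose rather than being passed in by hand; the ordering logic stays the same.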
What I can tell you is the 835 processor is 40 percent more efficient than the 805 processor that we are currently running. It's 10 nanometers. It's 10 nanometers. That's right. It's just crazy. Yeah. How did you get the chips early? How can you be the first? Well, you know, I would say that that's just more validation that Qualcomm sees that smart glasses are going to be the way of the future, and ODG, as we like to say, is leading that way. Google Glass was great, but it never really happened because they didn't want to sell it. So are you going to sell it? Right? We're selling it today. We've been shipping since November. You can go online and you can buy our glasses today. The R7, right? The R7 glasses. Based on the 820, or what's inside? The 805. So you're shipping the 805 and people can buy at $2,000. How much? $2,750. And how much is going to be the R8, the R9? The R9 will be under $1,800, so it will be $1,799. And the R8 is going to be under $1,000. All right, which is not much more than the iPhone. That's right. Yeah, exactly. So for today's standards, you know, for being able to get that hands-free, body-position-independent, big-screen HDTV you can take with you anywhere, you know, it has a lot more packed in than even your iPhone does today. So you do business development at ODG, right? That's right. So how many companies are approaching you? How many projects do you have going on right now? So, you know, as I mentioned, 44% of all Fortune 500 companies have either purchased the glasses or budgeted for them to kickstart development. We're in a lot of different stages, everything from pilot programs to deployment with Fortune 500 companies that are out there. So, you know, this is going to be a big part of the future. And this is technology that not only companies and enterprises use, but the consumer will be using as well. Do you partner with Google? All right. You have all these augmented apps to work with and stuff.
So by the nature of it being Android, you know, we're a power user for Google. Certainly, there's openness to a tighter partnership between Google and ODG in the future. They really have to get into this and help you get a nice app ecosystem in there. Not only for ODG, but in general. They have Pokemon Go. Does it work? Pokemon Go does work. And in fact, you can search ODG and Pokemon Go, and there's a picture of me catching the first Pokemon with a pair of smart glasses. So how do you catch it? So we have what's called an RSM. It's this little finger controller that goes on your finger. It has a trackpad and motion sensors in it. So in order to throw the Pokeball, all I do is just move the RSM like this, and I can toss the ball out to catch the Pokemon. Nice. Do you walk around in the street with ODG on your head all the time, or do you not? Of course I do. Why wouldn't I? Yeah. You can see me in San Francisco wandering the streets with it on my face. You do that? Of course. Yeah. Every day. Every day. When I wake up, I actually just wake up and I put them straight onto my head. No, not really. Not really. But you do walk around and you do try different, I guess, prototypes of future applications. Yeah. I mean, it's really about the use case. Like when I travel, of course I'm putting on the glasses and I'm watching movies while I'm sitting on that flight. It's a big-screen mobile movie theater I can take with me anywhere. Is it comfortable? Does it get tiresome on the eyes? It doesn't, actually. And what we've done is, unlike, as you mentioned, Google Glass, where you have to look up, which causes eye fatigue, with our glasses it converges at eight feet, where your eyes naturally converge. So you'll never have any eye fatigue from looking through. So are you using a Kopin micro-display, or how does it work? So we actually do all of our displays in-house.
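The eight-foot convergence claim can be made concrete with a little geometry: the vergence angle is how far the two eyes rotate inward to fixate a point, and it shrinks quickly with distance. A quick sketch; the interpupillary distance is a typical value, not an ODG specification:

```python
import math

def vergence_deg(distance_m, ipd_m=0.063):
    """Total angle the eyes rotate inward to fixate at distance_m.

    ipd_m is a typical adult interpupillary distance (~63 mm);
    both numbers are illustrative, not ODG figures.
    """
    return 2 * math.degrees(math.atan((ipd_m / 2) / distance_m))

# A virtual screen at ~8 feet (2.44 m) needs only about 1.5 degrees of
# convergence, close to the eyes' relaxed state, whereas a phone held
# at 0.5 m needs roughly 7 degrees.
print(round(vergence_deg(2.44), 2))
print(round(vergence_deg(0.5), 2))
```

This is one plausible reading of why a display focused at eight feet is claimed to be less fatiguing than a near-eye image you have to glance up at.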
That's really where we shine: we are the manufacturer of the glasses. All of our patents are around optics, or a lot of them. And so out of the industry, I would say that our optics are probably the best resolution. So you're not using Kopin? You're not using Epson? No, not at all. They have their own solutions. How do you compare with Epson? So Epson is a totally different head-worn platform. They're using prisms, and we use OLED. And so a big difference is, we've done prisms, we've built waveguides, we've done it all. We've been around since 1999, like I said. And we think right now our current optics are the best design for the highest resolution and the best field of view. Is that a transparent OLED, or what is it? It is a transparent OLED. So you're looking through the OLED? That's right. That's correct. Which is crazy. It is crazy. So when it's off, how much do you see through? Is it like tinted sunglasses? Yeah, it's like very lightly tinted sunglasses. You're looking at probably over 60% transmissivity. And that's dual 1080p. And 1080p to each eye. That's right. There are two displays right there. That's right. So it's more like a 4K experience, but it's 1080p. It's crazy. So are you going to do a 4K version? We have a 4K camera that's in there. I'm sure the military has 4K, right? No, they don't care about the movies. They just want to be able to... Don't they need high resolution? For the pilots and all that stuff, maybe? Well, they may have it in the chips. You cannot say what they have. Yeah, I can't tell you what the military has. All right. So lots of apps. Can app developers contribute right now? What can they do? Yeah, so like I said, it's Android. So probably I'd say 98% of Android applications work out of the box on our glasses. So you could easily upload your APK onto the glasses and be able to run it.
Now, with that said, there are certainly optimizations that you would want to make for a head-worn platform versus a smartphone or a tablet. One, like you mentioned, is see-through. The other is that you can use the motion sensors in it, and you can also pin information around your environment versus being fixed to just one small screen. I want to go to the museum and get information next to the paintings. Sure, object recognition. Look at your Mona Lisa, see who painted it, when it was made. It's easy. And just say, give me more information. It's so awesome. It just comes, more and more and more. That's right. Or we could have the Mona Lisa walk out of the actual painting itself and come do a little dance for you in front of it. And I'd like to walk around in the city and get information about places that are in the city. Yeah. Right there in front of me. The glasses actually have GPS in them as well. So you can actually do navigation around the city. Object recognition, SLAM. You know, it's able to do it all. And it tells me where's a nice restaurant? Yeah, so think about Yelp's Monocle. Nobody uses Monocle because it's a pain to hold your phone up and actually look around to see where the points of interest are. Now with a head-mounted display, you can actually just look at where you are and be able to get the ratings that pop up above these restaurants. See the reviews in real time. That's real value. Google Goggles is an app where you can recognize stuff. There's a lot of stuff that needs to be recognized out there. Deep learning, image recognition. I guess a company like Google needs to be there. It tells you what it is, just coming through the cloud. Sure. You know, I mean, we're a hardware company, and we do the hardware really well.
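The Monocle-style overlay described above boils down to a simple check: from the wearer's GPS position and compass heading, compute the bearing to each point of interest and draw the ones that fall inside the display's field of view. A rough sketch; the coordinates and the 40-degree field of view are illustrative assumptions, not ODG internals:

```python
import math

def poi_in_view(user_lat, user_lon, heading_deg, poi_lat, poi_lon, fov_deg=40):
    """Decide whether a point of interest falls inside the wearer's view.

    Uses an equirectangular approximation (fine over city-block
    distances) to get the bearing from the user to the POI, then
    compares it against the compass heading and display field of view.
    """
    dlat = poi_lat - user_lat
    dlon = (poi_lon - user_lon) * math.cos(math.radians(user_lat))
    bearing = math.degrees(math.atan2(dlon, dlat)) % 360
    off_axis = (bearing - heading_deg + 180) % 360 - 180  # signed, -180..180
    return abs(off_axis) <= fov_deg / 2

# Wearer at a hypothetical San Francisco position facing due north (0 deg);
# a restaurant slightly north-northeast lands inside a 40-degree view.
print(poi_in_view(37.7749, -122.4194, 0, 37.7759, -122.4192))
```

A real renderer would also use the distance to scale the label and the off-axis angle to place it horizontally, but the visibility test is the core of it.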
We leave it up to people like Google and our, you know, thousands of app developers that are out there to develop these types of, you know, AI algorithms and computer vision algorithms in the context of the glasses. Can you click and stuff with your fingers without using the remote, if it's calibrated to know where you're pointing? So we have different software partners. One of them is Augumenta. They have a software development kit that enables gesture recognition. So let's say you're a technician and you're walking through a checklist. You want to show that you completed a step, you give it a thumbs up. You want to go to the next one, you swipe right. So it really just has to be integrated into the app. And if you want night vision, you add the module? That's right. With our sensor module, you could put your night-vision sensor into the glasses. Now you have night vision. One point on that: our owner, Ralph Osterhout, is the inventor of the PVS-7 night-vision goggles. So night vision, as you know it today, our owner also invented that. The owner of ODG. So is he around? What is he doing? Is his idea to do all this consumer stuff now? This is his passion, right? Head-worn is his passion. He's been doing it for 32 years. Like I said, he invented the night-vision goggles. He's put heads-up displays in scuba masks. We have done... James Bond? James Bond. He actually invented the underwater car for James Bond. So if you go watch the movie, he invented that underwater car. He used to do a bunch of scuba stuff for the Navy too. But I guess the real James Bonds have been using his technology for many years now. That's right. So look it up. There's an article they actually did, and they call him the real-life Q. He's the guy behind James Bond, creating all the cool gadgets like James Bond uses in the films. So there's him, and there's Tesla. What's his name? Elon Musk. Elon Musk, that's right. Who do you root for? Who is that guy?
Well, you know... What? Yeah, maybe so. I would say our boss has a pretty cool history, though. Cool. And how does it look, to sell millions of units to consumers and get the price down and all that stuff? Sure. I mean, so 21st Century Fox, as I mentioned, is our lead investor. They are, you know, really focused on this being the next platform that consumers are going to consume media on. And so we're taking steps in that direction. Is LTE built in? No, not yet. Not yet. There's nothing. It's not on the roadmap today. But that's not to say that at some point, when, you know, you go through all the regulatory hoops, someday it may be. Yeah, the Snapdragon 835 has an LTE modem built in. So you could enable it, maybe. You could. You could. Cool.