From the Fairmont Hotel in the heart of Silicon Valley, it's theCUBE covering When IoT Met AI: The Intelligence of Things. Brought to you by Western Digital. Hey, welcome back everybody. Jeff Frick here with theCUBE. We're in downtown San Jose at the Fairmont Hotel at a little show called When IoT Met AI: The Intelligence of Things, talking about big data, IoT, AI, and how those things are all coming together with virtual reality, artificial intelligence, augmented reality, all the fun buzzwords. But this is where it's actually happening, and we're really excited to have a pioneer in this space. He's Jack McCauley. He was a co-founder at Oculus VR, now spending his time at UC Berkeley as an innovator in residence. Jack, welcome. Thank you. So you've been watching this thing evolve. Obviously Oculus was way out front in the VR space, and I think augmented reality in some ways is even more exciting than pure virtual reality. So what do you think as you've watched this thing develop from the early days, when you first sat down and started putting it all together? Well, I come from a gaming background. That's what I did for 30 years. I worked in video game development, particularly the hardware side, console hardware. That's right, you did Guitar Hero. Guitar Hero? Yeah, that's right. I've got that one at home. I designed and built their guitars for Activision when we were part of RedOctane, which is a studio. I primarily worked in the studios, not the headquarters, but I did some of the IP work with them too. So to your question: when you produce a product and put it on the market, you never really know how it's going to do, right? So we made two developer kits, put them out there, and they exceeded our expectations. And that was very good. It means there is a market for VR. There is.
We produced a consumer version, and sales were not what we expected for that particular product. It was targeted at PC gamers and, hopefully, console gamers. But what has done well is the mobile stuff; it has exceeded everyone's wildest expectations. I heard numbers for Gear VR, which is an Oculus-designed product: we sold 7 million of those. That's a smash hit. Worldwide, phone-mounted VR goggles are at about 20 million, and that's in just two years. So that's really intriguing. What has happened is the market shifted away from an expensive PC-based rig, $700 or whatever the headset costs plus $1,500 for the computer, to something that costs $50 where you stick your cell phone in it. It doesn't give you the best experience, but that's what has sold. So if I were doing a startup right now, I would not be working on PC stuff; I would be working on mobile stuff. And the next thing that I think will play out of this, and I think you mentioned it prior to the interview, is 360 cameras. Google has announced a camera that's coming out for their VR180 initiative, which lets you see 180-degree video in stereo with a cell phone strapped to your face. That's very intriguing. There are a couple of companies out there working on similar products. LucidCam, which is a startup company here, has a 180-degree camera. It's very, very good, and they have one coming out in 4K. They just launched their product. So to answer your question, it looks like what's going to happen for VR is a cell phone strapped to your face and a camera somewhere else, so you can view an experience: a concert, say. Imagine taking it to a concert or a sporting event where 5,000 or 10,000 people can view the video from your seat. That's very intriguing. Yeah, it's interesting. I had my first experience with this, not even 360 or live view, but I did a Periscope from the U2 concert here at Levi's Stadium a couple of months ago, just to try it out.
I'd never really done it, and it was fascinating to watch the engagement of people on that application who had either seen them the prior week in Seattle or were anticipating them coming to the Rose Bowl, I think, within a couple of days. And to have that interaction just from my little mobile phone. I was able to find a rail, so I had a pretty steady vantage point. It was a fascinating, different way to experience media, and engagement, and a kind of crowd interaction beyond the people who happened to be standing around me. What's intriguing about VR180 is that anybody can film the concert and put the video on YouTube or stream it through their phone. Formerly that would have required a $10,000 stereo camera set up professionally. Can you imagine a crowdsourced sort of thing, where the media is sourced by the crowd and anyone can watch it with a mobile phone? That's what's happening, I think. And Google's announcement reinforces my opinion that that is where the market will be: live events, sporting events. Right, it's an experience, right? It all goes back to experience. People are so much more experience-driven these days than thing-driven, in everything from buying cars versus taking an Uber, and you see it over and over again. People want the experience but not necessarily, as the CEO of Zuora put it, the trappings and hassles of ownership: let me have the fun, I don't necessarily want to own it. But I think the other thing that gets talked about less, and I'd like your opinion on it, is the combination of virtual reality plus the real world: augmented reality.
We see it in the industrial internet of things all the time: you take a walk on the factory floor, you put your goggles on, and not only do you see what's actually in front of you, but now, almost like a heads-up display, certain characteristics of the machinery are driven from the database side back into the goggles, and the richness of your observation is completely changed. Yes, and in some ways that's what Google tried with Google Glass, though not as successfully as we'd have liked to see. A first attempt? Yeah, they were way ahead of their time. Snap has their Spectacles, right? We've seen those; it's not augmented reality, but there'll come a time when you could probably have a monocle on your face and see the kinds of things you need to see. If you're driving a car, for instance, a heads-up display, or a projector projecting right onto your retina. Right. So I think that's the main thing for augmented reality. I mean, Pokémon Go, that's kind of an AR game in a way. You look through your cell phone and the character appears fixed on the table, or wherever you're looking. That uses a mobile device to do it. I can imagine other applications that use a mobile device that way, and I'm aware of people working on things like that right now. Right. So do you think the breakthrough of mobile versus the PC-based system was that it was just good enough, and that you could experience it so easily? I mean, Google gave out hundreds of thousands of the Cardboard viewers, so, wow. Yeah. Well, that didn't mean Gear VR didn't move into the market. It did, you know, it did anyway.
But to answer your question about AR: without good localization, the problem with wearing Google Cardboard and Gear VR is that it kind of makes you sick a little bit. Right. And then there's the localization part: how to get rid of the nausea effect. I watched a video of the Pride Parade in San Francisco that was filmed with LucidCam, and I put it on, and the person filming was moving with the crowd, and I just felt nauseous. So that problem is probably the one I would attempt to attack if I were going to build a company or something like that right now. Right. But I wonder how much of that is just getting used to the format, because when people first put them on, sure, they're just like, ah. But if you settle in a little bit, our eyes are pretty forgiving and get used to things pretty quickly. Your mind can get accustomed to it to a certain degree, but even I get nauseous, and I don't get nauseous very easily. Okay, so your title should just be Tinkerer, right? I looked at your Twitter feed; you're building all kinds of fun stuff in your not-a-garage-but-a-big-giant-lab. Yeah, right. And you're working at Berkeley. What are some of the things you can share that you see coming down the road, that people aren't necessarily thinking about, that are going to take some of these technologies to the next level? I've got one for you. You've heard of autonomous vehicles, right? Yeah, yeah. And you've heard of HoloLens, right? Yeah. HoloLens is an augmented reality device you put on your head. It's got built-in localization, and it uses what's known as SLAM, simultaneous localization and mapping, to build a mesh of the world around you. With that mesh, the next person who comes into the virtual world you mapped will be way ahead. In other words, the map will already exist and they'll build on it. And the mesh always gets updated.
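The shared-map idea Jack describes, where each agent's SLAM pass leaves behind a map that the next agent refines rather than rebuilding from scratch, can be sketched with a toy example. This is a minimal illustration using a 2-D occupancy grid instead of a real 3-D mesh; the class and method names are hypothetical, not from the HoloLens SDK or any real SLAM library.

```python
# Toy sketch of a shared map that successive agents contribute to.
# A real system would fuse 3-D meshes with pose estimation; here each
# cell just accumulates occupied/free observations from each pass.

class SharedOccupancyGrid:
    def __init__(self, width, height):
        # Per cell: how many observations saw it occupied, and how many
        # observations covered it at all.
        self.hits = [[0] * width for _ in range(height)]
        self.seen = [[0] * width for _ in range(height)]

    def integrate(self, observations):
        """Merge one agent's pass: a dict {(x, y): occupied_bool}."""
        for (x, y), occupied in observations.items():
            self.seen[y][x] += 1
            if occupied:
                self.hits[y][x] += 1

    def occupancy(self, x, y):
        """Estimated probability the cell is occupied; None if never seen."""
        n = self.seen[y][x]
        return self.hits[y][x] / n if n else None

# The first vehicle maps part of the road; the second starts from that
# shared map and adds its own observations, refining the estimate.
grid = SharedOccupancyGrid(4, 4)
grid.integrate({(0, 0): False, (1, 0): True})   # vehicle A's pass
grid.integrate({(1, 0): True, (2, 0): False})   # vehicle B builds on it
```

After both passes, cell (1, 0) has been confirmed occupied twice, cell (2, 0) has one free observation, and unvisited cells remain unknown, which is the "next guy will be way ahead" property: later agents inherit and update the accumulated map instead of starting blind.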
Can you imagine getting that into a self-driving vehicle, just for safety's sake? Mapping out the road ahead of you: the vehicle ahead of you has already mapped the road for you, and you're adding to the mesh and adjusting the mesh. So as far as HoloLens and its localization system are concerned, I think that's going to be really relevant to self-driving cars. Whether or not it'll be Microsoft's SLAM or somebody else's, I think that's the good thing that came out of HoloLens, and it will bleed into the self-driving car market. It's a big number-crunching problem. Steve Jobs was actually looking at this a long time ago, asking what could be done with self-driving vehicles, and I think he abandoned the idea because he realized he had a huge computing and data problem. Well, that was 10 years ago. Things have changed. But I think that's what will come out of this AR stuff: that localization system transported into other areas of technology, self-driving cars and so forth. It's interesting how much of this gets distilled and applied into autonomous vehicles, which is such a great application for people to see and understand, and so tangible. Yeah, it may change the way we think about cars; you may just not own a car. Oh, I think that's coming. Absolutely, and so is change across the car industry: ownership, usage, frequency of use, how cars are used. It's not a steel cage anymore for safety as the crash rates go down significantly. I think there are a lot of changes coming. You buy a car and it sits for 20 hours a day. Right, right, right. Unutilized. All right, well, Jack, I hope maybe I get a chance to come out and check out your lab sometime, because you're looking at all kinds of cool stuff. When's that car going to be done? I took it upon myself to remodel a house at the same time, but the car has been moving ahead.
By September, I think I can get it started: get the engine running and get the powertrain up and running. Right now I'm working on the electronics, and we have an interesting feature on that car that we're going to announce later. Okay, we'll look out for that. We'll keep watching the Twitter feed. All right, thanks for taking a few minutes. All right, that's Jack McCauley. I'm Jeff Frick. You're watching theCUBE from When IoT Met AI: The Intelligence of Things, in San Jose. We'll be right back after this short break. Thanks for watching.