Live from Austin, Texas. It's theCUBE, covering South by Southwest 2017. Brought to you by Intel. Now, here's John Furrier. Welcome back everyone. We're live here with theCUBE coverage of South by Southwest. We're at the Intel AI Lounge, hashtag Intel AI. And the theme is AI for social good. So if you really support that, go on Twitter and use the hashtag Intel AI to support our cause. I'm John Furrier with SiliconANGLE. I'm here with Robert Scoble of Scobleizer, who just announced the formation of his new company, Transformation Group, this week. I've known Robert for over 12 years now. Influencer, futurist. You've been out and about with virtual reality and augmented reality, you're wearing the products. You've been all over the world. Mobile World Congress just wrapped, and we've been following you. You are the canary in the coal mine, poking at all the new technology. Well, over the next five years you're going to see some mind-blowing stuff. In fact, just next year, I predict that this thing is going to turn into a three-ounce pair of glasses that's going to put virtual stuff on top of the world. So think about coming back to South by Southwest next year. You're wearing a pair of glasses and you are going to see blue lines on the floor taking you to your next meeting, or TV screens up here so I can watch theCUBE while I walk around the streets here. It's going to be a lot of crazy stuff. So in our opening segment we talked about it, and we just had a segment on social good around volunteering, but the theme that's coming out is this counterculture where there's now a humanization aspect. They called it the consumerization of IT in the past, but in the global world, human involvement now means immersive experiences with technology. And now it's colliding with, and impacting, people's lives. Well, absolutely true.
You know, this is a Microsoft HoloLens, first of all, and HoloLens puts virtual stuff on top of the real world. But at home I have an HTC Vive and I have an Oculus Rift for VR, and VR is that immersive media. This is augmented reality, or what we call mixed reality, where the images are put on top of the world, so I could see something pop off of you. In fact, last year at South by I met a guy who started a company called Eyefluence, and he showed me a pair of glasses where you look at a bottle like this and a little menu pops off the side of the bottle; it tells you how much it is, tells you what's in the bottle, and lets you buy new versions of this bottle, like a case of it, and have it shipped to your house, all with your eyes. That's coming out from Google next year. So the big thing on the immersion, the AR, I mean, you look at what's going on in societal impact. What are the things that you see? Obviously, we saw at Mobile World Congress that the four pillars came out: autonomous vehicles, which is game-changing; smart cities; media and entertainment, the world close to our world; and then smart home. Oh yeah. And smart home's been around for years, but autonomous vehicles truly is a societal change. Yes. I mean, the car is a data center now. It's got experiences. And there are three new startups you should pay attention to in the new cars that are coming in the next 18 months. Quanergy is one; they make a new kind of LiDAR, a new sensor. In fact, there are sensors here that are sensing the world as I walk around and seeing all the surfaces, right? The car works the same way. It has to see ahead to know that if there's a kid in front of your car, the car needs to stop, right? And Quanergy is making a focusable semiconductor LiDAR that's going to be one to watch. And then there's a new kind of brain, a new kind of AI coming. And DeepScale is the one that I'm watching.
The DeepScale brain uses a new third company called Luminar Technologies, which is making a new kind of 3D map of the world. So think about going down the street. This new map is going to know every pothole, every piece of paint, every sign, every bridge on the street, and the brain, the AI, is going to compare the virtual map to the real world and see if there's anything new, like a kid crossing the street, right? Then the car needs to do something and make a new decision, right? So three new startups are going to really change the car. But the reason I'm so focused on mixed reality is that mixed reality is the user interface for the self-driving car, for the smart city, for the internet of things, for the fields in your farm or whatnot, and for your robot, and for your drone, right? You're going to have drones that are going to know this space, and you can fly them, right? I've seen drones already in the R&D labs at Intel; you can fly them straight at the wall and they'll stop an inch from the wall, because they know where the wall is. Because they've got the software, they've got the sensors, the internet of things. We are putting out a new research report at Wikibon called IoTP. Yep. Internet of Things and People. And this is the key point. I want to get your thoughts on this because you nailed a bunch of things, and I want you to define for the folks watching what you mean by mixed reality, because this is not augmented reality. Well, it is. You're talking about mixed reality. It is augmented reality. It's just- But why mixed reality? We came up with a new term, mixed reality, because we have augmented reality on phones. But the augmented reality you have on phones, like the Pokémon we've been talking about, they're not locked to the world.
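The compare-the-virtual-map-to-the-real-world idea can be sketched as a simple occupancy-grid diff: anything the live sensors see that the stored map doesn't contain is flagged as new. The grid size and threshold here are illustrative assumptions, not details from any real autonomy stack:

```python
def detect_new_obstacles(prior_map, live_scan, threshold=0.5):
    """Return (row, col) cells that the live scan sees as occupied
    but the stored 3D map says are free -- the 'anything new on
    the street, like a kid crossing' check."""
    hits = []
    for r, (prow, lrow) in enumerate(zip(prior_map, live_scan)):
        for c, (p, l) in enumerate(zip(prow, lrow)):
            if l > threshold >= p:
                hits.append((r, c))
    return hits

# Toy 4x4 patch of street: the prior map is empty, the live
# LiDAR-derived grid shows one newly occupied cell.
prior = [[0.0] * 4 for _ in range(4)]
live = [[0.0] * 4 for _ in range(4)]
live[2][1] = 1.0
print(detect_new_obstacles(prior, live))  # [(2, 1)]
```

Anything the diff flags is then handed to the planner to make a new decision, which is the division of labor the interview describes between the map and the AI brain.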
So when I'm wearing this, there's actually a shark right here on this table, and it's locked on the table, and I can walk around that shark and it seems like it's sitting here just like this bottle of water is sitting on the table. This is mind-blowing, and now we can actually change the table itself and make it something else, because every pixel in this space is going to be mapped by these new sensors. So let's take that to the next level. You mentioned earlier in your talk the user interface to cars, to smarts. You didn't say just the interface to cars. You didn't say just smart things. I think you meant the interface to all the environments. Can you expand on your thoughts on that? You're going to be wearing glasses that look like yours in about a year, much smaller than this. This is too dorky and too big for an average consumer to wear around, right? But if they're three ounces, then they look something like what you're wearing right now. Some nice Ray-Bans, yep. And they're coming. I've seen them in the R&D labs. They're coming from a variety of different companies. Google, Facebook, Lumus, Magic Leap, all sorts of different companies are coming with these lightweight, small glasses. You're going to wear them around and they're going to lay interface elements on everything, right? So think about my watch, right? If I do this gesture, why do I have to look at a little tiny screen right here? Why doesn't my whole calendar pop up right here, right? They could do that; that's a gesture. The computer in here can sense that I'm doing a gesture and can put a new user interface on top of that. Now, I've seen tractors that have sensors in them. Using glasses like these, it shows me what the pumps are doing in the tractors, right? On the glasses. I can walk around a factory floor and see the sensors in the pipes on the factory floor and see the sensors in my electric motors in the factory, all with one pair of glasses.
This is why the Intel AI theme interests me, because what you just described requires data, right? You need to have the data available. The data's got to be frictionless. It can't be locked in some schema, as they say in the database world. It's got to be free to be addressed by software, the software that understands what that is, and then you need horsepower, compute power, chips to make it all happen. Yeah, think about a new kind of TV that's coming soon. I'm going to look at a TV like this one, a physical TV, but it's too small and it's at the wrong angle. So I can just grab the image off the TV and virtually move it over here. And I'll see it, nobody else will see it, but I can put that TV screen right here so I can watch my TV the way I want to watch it, right? All right, so this is all sci-fi, great stuff, which actually- It's not sci-fi, it's here already. You just don't have it. I have it. You can say it's kind of dorky, and I know you, I'm not going to say you're a dork, but it's still geeky. But it's not 10 years away. To mainstream America, to the mainstream world, it's a bit sci-fi, but people are grokking this now. Certainly the younger generation, the digital natives, all coming up post-9/11, they understand that this is a native world for them, and they take to it like a fish to water. Yes. Us old guys, well, we are the software guys, we're the tech guys. So taking it to mainstream America, what has to happen in your mind to mainstream this stuff? Self-driving cars are coming, fleets first and then consumer cars. We have to take people on a journey away from computing like this, or computing like this, to computing on glasses. So how do we do that? Well, you have to show deep utility, and these glasses show that. Wearing a HoloLens, I see aliens coming out of the walls, blowing holes in this physical wall. Like right now? Yeah. What are you smoking? Nothing yet.
And then I can shoot them with my fingers, because the virtual things are mixing with the real world. It's a mind-blowing experience. So do you see this being programmed by users, or being a library of stuff? Some of it is going to be programmed by users, like Minecraft is today on a phone or on a tablet, right? But most of it is going to be built by developers. So there's a huge opportunity coming for developers. You're gonna hear- Talk about the developer angle, because that's huge. We're seeing massive changes in the developer ecosystems. Certainly open source is going to be around for a while. But what trends do you see in the developer community, with this new overlay of 5G connectivity and all this amazing cloud technology? There's a new 3D map coming, and it's a SLAM-based map. So think about this space, this physical space, right? These sensors on the front of these new kinds of glasses that are coming out are going to sense the world in a new way and put it into a new kind of database, one that we can put programmatic information into. So think about me walking around a shopping mall. I walk in the front door of a shopping mall. I cross a geofence in that shopping mall. And the glasses then show me information about the shopping mall, because they know they're in the shopping mall. And then I say, hey Intel, can you show me, or Siri, or Alexa, or Cortana, or whoever you're talking to? She's powered by Intel. Right, most of it is powered by Intel, because Intel's in all the data centers and all these glasses. Right, in fact, Intel is the manufacturer of the new kind of controller that's inside this new HoloLens. And when I ask it, I can say, hey, where are the blue jeans in the shopping mall? And all of a sudden, three new pairs of blue jeans will appear in the air, virtual blue jeans. And it'll say, this one's a Guess, this one's a Levi's, this one's whatever.
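The geofence trigger described here is, at its simplest, a point-in-circle test against the mall's coordinates; the Austin coordinates and the 100-metre radius below are made-up illustration values, not any real venue's fence:

```python
import math

def inside_geofence(lat, lon, fence_lat, fence_lon, radius_m):
    """Crude point-in-circle geofence test using an equirectangular
    distance approximation (fine at shopping-mall scale)."""
    earth_r = 6_371_000  # metres
    dlat = math.radians(lat - fence_lat)
    dlon = math.radians(lon - fence_lon) * math.cos(math.radians(fence_lat))
    dist = earth_r * math.hypot(dlat, dlon)
    return dist <= radius_m

# Hypothetical mall entrance in Austin, 100 m fence: crossing it is
# what would switch the glasses into "show mall info" mode.
mall = (30.2672, -97.7431)
print(inside_geofence(30.2675, -97.7433, *mall, 100))  # True
print(inside_geofence(30.30, -97.70, *mall, 100))      # False
```

A production system would use a proper geodesic distance and polygon fences rather than a circle, but the wake-up-on-entry logic is the same.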
And I'll say, oh, I want the Levi's 501, and I'll click on it, and a blue line will appear on the floor taking me right to the product. The shopping mall companies already have the data. They already know where the jeans are in the shopping mall, and these glasses are going to take you right to it. I'll say, Robert, so AI is the theme, and it's hot. And I love AI, don't get me wrong. AI is a mental model in my mind for people to kind of figure out that this futuristic world's here and it's moving fast. But machine learning is a big part of what AI is becoming. So machine learning is becoming automated. Talk about that. Well, it's becoming a lot faster. Faster and available. Because it used to take 70,000 images of something like a bottle to train the system that this is a bottle versus a can, right? Bottle versus can. And the scientists have figured out how to make it two images now. So all I need is two images of something new to train the system that we have a bottle versus a can. And there's also the fact that compute's available. There are more and more, faster processors that this stuff can get crunched on. The data can be crunched. Absolutely. But it's the data that trains these things, right? So let's talk about the bleeding edge of AI. I've seen AIs coming out of Israel that are just mind-blowing. They take a 3D image of this table. They separate everything into objects, right? So this is an object that's separate from the table that it's on. And it then lets me do AI lookups on the object. So this is a Roxanne bottle of water, right? The 3D sensor can see the logo on this bottle of water, can look to the cloud, find all sorts of information about the manufacturer, what the product is, all sorts of stuff. It might even pull down a CAD drawing of the computer that you're on and overlay it on top of the real product. And now we can put videos on the back of your Macintosh or something like that, right?
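The jump from 70,000 training images to two is essentially few-shot learning: given a good pretrained image embedding, a new class can be recognized by nearest-neighbour comparison against just a couple of labelled examples. The three-number vectors below stand in for real embeddings and are invented for illustration:

```python
def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb)

def classify(embedding, support):
    """Label a new image by its most similar labelled example."""
    return max(support, key=lambda item: cosine(embedding, item[1]))[0]

# Two labelled example embeddings per class -- the "two images"
# of bottle versus can. Numbers are made up.
support = [
    ("bottle", [0.9, 0.1, 0.0]), ("bottle", [0.8, 0.2, 0.1]),
    ("can",    [0.1, 0.9, 0.2]), ("can",    [0.0, 0.8, 0.3]),
]
print(classify([0.85, 0.15, 0.05], support))  # bottle
```

The heavy lifting in a real system is done by the pretrained network that produces the embeddings; the per-class training really is just storing a couple of examples.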
You can do mind-blowing stuff coming soon. That's one angle. Let's talk about medical. In Israel, I went to the MRI manufacturers. They're training the MRI machines to recognize cancers. So you're going to be lying in an MRI machine, and it's going to tell you, or tell the people around the machine, whether you have cancer or not, and which cancer. And it's already faster than the doctor, cheaper than the doctor, and obviously doesn't need a doctor. And that's going to lead into a whole discussion on- And the CRISPR thing; these are societal problems, by the way. The policy is the issue, not the technology. How do you deal with the ethical issues? Ethical? Around gene sequencing and gene editing and- That's a whole other thing. I'm just recognizing whether you have cancer in this example. But now we need to talk about jobs. How do we make new jobs in massive quantities? Because we're going to decimate a lot of people's jobs with these new technologies. So we need to talk about that, probably on a future CUBE. But I think mixed reality is going to create millions of jobs, because think about this bottle. In the future, I'm going to be wearing a pair of glasses, and Skrillex is going to jump out of the bottle, onto the table, give a performance, and then jump back into the bottle. That's only four years away, according to the guy who's running a new startup called 8i. He's making a new volumetric camera; it's a rig with 40 or 50 cameras around it. Skrillex, Martin Garrix, come on. Yeah, yeah, whatever you want. Remember, this media is going to be personalized to your liking. Spotify is already doing that. If you listen to Spotify- Yeah, of course. Have you listened to the Discover Weekly feature on that? You should. It's magical. It brings you the best music based on what you've already listened to, and it's personalized. So your Discover Weekly on your phone is different than the Discover Weekly on my phone, and that's run by AI. So these are new collaborative filters.
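A collaborative filter in the Discover Weekly sense can be sketched in a few lines: recommend tracks that the listener most similar to you has played and you haven't. The play histories and track names below are invented toy data, nothing like Spotify's actual pipeline:

```python
def recommend(user, plays, k=1):
    """Recommend up to k unheard tracks to `user` by borrowing from
    the listeners whose play histories overlap theirs the most --
    a toy user-based collaborative filter."""
    mine = plays[user]
    def overlap(other):
        return len(mine & plays[other])
    peers = sorted((u for u in plays if u != user), key=overlap, reverse=True)
    recs = []
    for peer in peers:
        for track in plays[peer] - mine:
            if track not in recs:
                recs.append(track)
    return recs[:k]

# Invented play histories: "alice" overlaps with "you" the most,
# so her unheard track is recommended first.
plays = {
    "you":   {"trackA", "trackB"},
    "alice": {"trackA", "trackB", "trackC"},
    "bob":   {"trackD"},
}
print(recommend("you", plays))  # ['trackC']
```

Real systems factor huge user-by-item matrices rather than counting raw overlaps, but the "people like you liked this" idea is the same.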
This is all about software. Yeah. All about software. Software and a little bit of hardware, because you still need to sense the world in a new way. You're going to get new watches this year that have many more sensors, that are looking in your veins for whether you have high blood pressure, whether you're in shape for running. By the way, you're going to have an artificial coach when you go running in the morning, running next to you. When you see Mark Zuckerberg, he can afford to pay a real coach. I can't, right? So he has a real coach running with him every morning saying, hey, we're going to do some interval training today. We're going to do some sprints to get your cardio up. Well, now the glasses are going to do that for you. They're going to say, hey, let's do some intervals today. It's a running bot. And you're going to wear the watch that's going to sense your blood pressure and your heart rate, with the artificial coach running next to you, and that's only two years away. Of course, great stuff. Robert Scoble, South by Southwest. I'll give you a quick one; we've got to close the segment quickly. How has South by changed in 10 years? Well, 20; I've been coming for 20 years. I've been coming since it was 500 people, and now it's 50,000, 70,000 people. It's crazy. How's it changed this year? What's going on this year? This is the VR year. This is the VR year. Every year we have a year, right? There was the Twitter year. There was the Foursquare year. This is the VR year. So if you're over at Capital Factory, you're going to see dozens of VR experiences. In fact, my co-author's playing The Mummy right now. I had to come on the show. I got the short straw, sitting in the sun instead of playing some cool stuff. But there's VR all over the place. Next year is going to be the mixed reality year. And South by is a predictor of the year that's coming. All right, Robert Scoble, futurist, here on theCUBE.
Also, congratulations on your new company, Transformation Group, and on going out on your own. Yeah, we're helping brands figure out this mixed reality world. Congratulations, of course. As always, it is a transformational time in the history of our world, and certainly the computer industry is going to a whole other level that we haven't seen before. And this is going to be exciting. Thanks for spending the time with us. That's theCUBE here live. It's South by Southwest, special CUBE coverage, sponsored by Intel. And the hashtag is Intel AI. If you like it, tweet us on Twitter. We'll be happy to talk to you online. I'm John Furrier; more after this short break.