200 to show you how far things have gone. So we're here at the TechEx show. And who are you?

Well, I'm Achin Bhowmik. I lead the Perceptual Computing Group at Intel. We're developing Intel RealSense technology, which brings human-like sensing to computing devices and machines.

And what are we looking at here?

You're looking at the RealSense camera, the LR200 series that is on the market now.

And this one?

Yeah, that one goes on drones and robots. And what you're seeing here in my left hand, the smaller device, is the next generation: the RealSense 400 series, with a smaller form factor, longer range, higher resolution, and better depth quality.

So a little device like this can do up to 60 meters?

It can go to more than 60 meters of range. If you want very high accuracy, then you want a shorter distance than that. But as you know, a drone can always fly closer to an object for a very accurate depth map.

So how can it see so far?

Well, this is based on the human visual system, with two imaging devices. As you can see, we see very far. You may not be able to tell the distance between the tree and the mountain, but when you get up close, you know the distance precisely, right? So you can see very far, and accuracy scales as you come closer to the object.

So if we look at the device again, is there a normal camera and an IR camera, or how does it work?

Let me give an example with this device here. It has two IR cameras, an RGB camera, and an IR laser projector. And inside the package there is a special chip that we built, the RealSense imaging processor. That does all the computation at low power.

Is it like an ISP, or how does it work?

No, an ISP is just an image signal processor; that's a single function. We have 3D computation hardware: the stereo correlation, the binocular disparity calculation, the depth calculation. All of those algorithms are coded into the hardware.

And that chip is not an ARM, right? And it's not an x86?
No, this is not a general-purpose processor. This is fixed-function hardware, custom-built to run specific algorithms.

All right. So there's Movidius and there's Project Tango. How do those compare?

Well, you might have seen the news that we acquired Movidius. They're part of my team now. Movidius brings deep-learning technology into these devices. I showed in the presentation how you can do low-power 3D point calculations and understand the world, and then you run deep-learning algorithms on a low-power chip like Movidius to understand what you see.

So that's yet another chip, right?

Yes. And then you connect it with a general-purpose processor like an x86 or an ARM for the general compute.

And Project Tango, are they using your technology? Or could they use your technology? Can it go down into phones and tablets?

I can only comment on what we've shown publicly together; you should ask Google about that. We are good partners, and we have shown devices with RealSense and Intel processors powering Tango phones. But they have various technologies in their portfolio, and they're also working broadly with the industry.

And this one you're talking about, it also has an Atom in here?

This is a device that we just introduced to the market as a developer kit. It has a RealSense camera, a fisheye camera, a motion sensor, an Atom processor, and Wi-Fi and Bluetooth connectivity. It's like a computer plus RealSense sensing and understanding capabilities, all built into a candy bar.

Is it like a Z8300, or?

It's got a ZR300 device.

And what does that one do? Does it compute extra deep learning on the device?

Yeah, you can think of this as the sensing and understanding capability of a robot. If you literally take this device and put it on an Arduino robot, a low-cost $20 robot, you can turn that into an autonomous machine, because with this it will be able to recognize the world and navigate.
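The stereo pipeline described here, two cameras whose binocular disparity is turned into depth, boils down to simple pinhole geometry, and it also explains the earlier point that accuracy improves as you fly closer. Here is a minimal sketch of that math; the focal length, baseline, and disparity-noise values are illustrative assumptions, not actual RealSense specifications.

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth for a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

def depth_error(depth_m, focal_px, baseline_m, disparity_noise_px=0.25):
    """Depth uncertainty grows with the square of distance:
    dZ ~ Z^2 / (f * B) * dd, which is why a drone that flies
    closer to an object gets a much more accurate depth map."""
    return (depth_m ** 2) / (focal_px * baseline_m) * disparity_noise_px

# Illustrative parameters (hypothetical, not a real device):
f_px, baseline = 650.0, 0.055   # focal length in pixels, baseline in meters
print(depth_from_disparity(10.0, f_px, baseline))  # ~3.6 m at 10 px disparity
print(depth_error(1.0, f_px, baseline))            # millimeter-scale up close
print(depth_error(30.0, f_px, baseline))           # meters of error far away
```

The quadratic error term is the key design point: range is effectively unbounded, but precision is concentrated near the camera.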
I don't think we have publicly announced the price, but we don't plan to make a profit from it; this is just to enable people to develop systems. It's got everything that you have in your phone minus the LCD screen, but with a depth camera, so figure $200 to $300.

But there's no battery, right?

There is a battery pack as well. I'm not showing it here, but you can clip a strip of batteries in here. Or you could connect it over USB to another device that has a battery.

Does it last for a while?

Yeah, it lasts a very long time.

All right, so this could change everything if everybody starts using it. So how popular is RealSense so far? And what's the history? How long has Intel been working on this?

Well, we started working on it roughly four years ago, and today it's a pretty significant investment for us. You saw the multiple acquisitions we made. We've shipped millions of these devices, which are already in consumer systems, and we're ramping from drones and robots to interactive computers to virtual-reality devices. We think we're at the onset of a revolution in interactive computing devices and intelligent systems.

And any manufacturer in the industry can just buy the module and put it in? It could be ARM devices, right?

Sure.

They can just buy it and put it in?

Yeah, you can just buy the module and add it to a computing device you already have. I showed how Yuneec, as a drone maker, already had drones in the market, and they added RealSense for autonomous collision avoidance. Robots that are already in the market added RealSense for autonomous navigation capabilities.

And the one that's on the market, is it even better than the Phantom 4? Is it the best drone for obstacle avoidance and follow mode and everything?

I'd rather not comment on Yuneec versus DJI, but publicly, DJI uses Movidius technology for computer vision and understanding.
Yuneec uses RealSense for collision avoidance. I think they're both great devices.

All right, and soon we'll have the ASUS one. It's going to be released soon, right?

Yeah, ASUS chairman Jonney Shih showed the Zenbo robot demo at Computex last June.

2017?

And they'll bring it to market soon. I don't think they've publicly commented on when they'll start shipping, but it should be very soon.

So I guess your department is very busy getting this everywhere, right?

Yes, we're very excited about it. We're working with a lot of companies from the robotics, drone, PC, and various other ecosystems to build this technology into their systems. You saw the digital mirrors, among many other examples.
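To make the earlier "put it on a $20 Arduino robot" idea concrete, here is a hypothetical sketch of how a small robot might turn a depth frame into a steering decision. The region split, thresholds, and synthetic frame are all illustrative assumptions, not anything from an Intel SDK.

```python
import numpy as np

def steer_command(depth_m, stop_dist=0.5):
    """Split the depth frame into left/center/right thirds and steer
    toward the region with the most free space; stop if everything
    within stop_dist meters is blocked."""
    thirds = np.array_split(depth_m, 3, axis=1)
    # Use a low percentile of each region as its "nearest obstacle",
    # which is more robust to sensor noise than the raw minimum.
    left, center, right = (np.percentile(t[t > 0], 5) for t in thirds)
    if center > stop_dist:
        return "forward"
    if max(left, right) <= stop_dist:
        return "stop"
    return "turn_left" if left > right else "turn_right"

# Synthetic frame: a wall 0.3 m away covering the left two-thirds of
# the view, open space (2 m) on the right.
frame = np.full((120, 160), 2.0)
frame[:, :100] = 0.3
print(steer_command(frame))  # prints "turn_right"
```

On a real system the frame would come from the depth camera's stream, and the command would drive the robot's motors; the decision logic itself is only a few comparisons, which is what makes a cheap chassis plus a depth module viable.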