We're here at the Qualcomm booth, and who are you? I'm Matt Branda, Director of Technical Marketing at Qualcomm Technologies. And who are you? I'm Giridhar, an engineer from Modem Systems Testing at Qualcomm. So this is crazy right here. You're talking about gigabit LTE. Gigabit. We're actually talking about achieving gigabit speeds over an LTE network. How does that work? Yeah, so we can do it in two different ways, using licensed or unlicensed spectrum. In the licensed spectrum, what we have is a 3CA connection with 4x4 MIMO on two of the carriers and 2x2 MIMO on one of the carriers, which gives you up to 1 gigabit per second. We also use 256-QAM modulation, which enables higher throughput speeds. And here what we have is Licensed Assisted Access, which uses a licensed carrier as the primary carrier, with the secondary cells on unlicensed carriers that share spectrum with Wi-Fi, Bluetooth, and other technologies, to achieve gigabit speeds. So you're going to have about 70% of operators being able to do 1 gigabit speeds in the future. 70%. So this is not just for some places. It's not just for Korea or a few special markets. No, no. It will be everywhere. It's global. Yeah, that's what this is about: globalizing gigabit LTE. What is it going to change for everybody? Is it going to mean faster mobile? It's going to be a faster mobile experience. It's going to enable more immersive experiences. We're showcasing over here streaming 360 video over a gigabit LTE link. I work on the modem. Sorry. Can I jump in here? So this is 360, 4K, 60 frames per second, right? Correct. So this is all over a gigabit LTE link. And that's what makes it possible: when you have fast connectivity, you can actually achieve these types of high-definition videos, with the capacity to support these types of applications. So here I'm just scrolling around. Is it doing a live stream from this one?
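As a rough sanity check on the configuration described above (a back-of-envelope sketch, not Qualcomm's actual link-budget math; the 20 MHz carrier width and the overhead factor are assumptions):

```python
# Back-of-envelope LTE downlink peak rate for the 3CA setup described:
# two carriers at 4x4 MIMO, one at 2x2, all using 256-QAM.
# Assumes 20 MHz carriers (100 resource blocks) and ~25% overhead
# for control channels, reference signals, and coding.

RES_PER_MS = 100 * 12 * 14   # resource elements per ms per layer (100 RBs x 12 subcarriers x 14 symbols)
BITS_PER_SYMBOL = 8          # 256-QAM carries 8 bits per symbol
OVERHEAD = 0.25              # assumed fraction lost to overhead

layers = 4 + 4 + 2           # MIMO layers across the three carriers
raw_mbps = RES_PER_MS * BITS_PER_SYMBOL * layers / 1000  # kbit/ms == Mbit/s
peak_mbps = raw_mbps * (1 - OVERHEAD)
print(f"raw: {raw_mbps:.0f} Mbps, after overhead: {peak_mbps:.0f} Mbps")
# -> raw: 1344 Mbps, after overhead: 1008 Mbps
```

So the 4+4+2-layer, 256-QAM combination lands right around the 1 gigabit per second figure quoted in the demo.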
Yeah, so this one is a reference device, probably similar to the devices that were used to capture these videos. And this one is doing a live demo right now, as we speak. It's capturing the video around us and stitching it in real time on the 835 processor. So it's stitching in real time. It doesn't overheat? It doesn't. You can try one of the devices. It's been running for hours. So who are you? My name's Reed Westberg. I'm on the multimedia R&D team, responsible for the application that's running right now. So again, all of this video is being streamed to these six devices. And right over here, you can see the video feed: each stream is being sent to each device and, in real time, being stitched on the GPU. So GPU, CPU, which part does it? Mostly the GPU, the graphics processing unit. Adreno 540, right? I'm not sure of the number, but yes. And real-time 4K 60, this is the only one that can do it. That's right. Again, this is pre-recorded content, but it's being streamed over that 1 gigabit connection right now. That's awesome. It's great. That's cool. Another thing, very important: not only is the video captured in 360 degrees, but the audio is captured as well. So this is a special prototype camera that Qualcomm developed, and it is an example of how to capture spherical audio. So if you notice, as I rotate this screen, the audio actually rotates in this 7.1 system as well. That's so cool. That's awesome. Thanks a lot. Cool. And you also talk about 5G. What is it? How does 5G work? So 5G is a new standard. We're pushing beyond gigabit LTE and starting to enable new types of things with the 5G new radio. So with 5G, we're not only talking about faster peak rates, but also faster uniform rates.
So no matter where you are in the cell, indoor or outdoor, far from the cell or near it, you're going to get those very fast data rates that enable the types of experiences we're seeing over there, like streaming 360 video. What does NR mean? NR means New Radio. That's the new standard. So where we had 4G LTE, with 5G we have 5G NR, and that's the new standard being developed to meet some of these extreme requirements that we're starting to see over the next decade and beyond. That's crazy. So you're already talking about gigabit, which sounds awesome. How is 5G even better than gigabit? So beyond gigabit, again, it's not just about peak rates with 5G, but how we deliver uniform rates to a user no matter where they are in the cell. So that when you're in an environment like CES, where you're walking around and data rates become very crunched, you can still achieve those speeds. But beyond just speeds, we're talking about very low latencies. So if you think about an application like virtual reality or augmented reality, where things need to happen in real time, we're going to deliver sub-10-millisecond latencies over a 5G network. So it's basically fiber without the cable. Exactly. That's what some of the operators are talking about: actually replacing fiber with a 5G network, so that you're getting a wireless connection that's basically like a wire. But that's sad for the cable manufacturers, though. No, I mean, I think it opens up new opportunities for the entire world. What it takes to actually deliver gigabit broadband speeds to every home, that's not a reality today. So that's going to open up new opportunities for the entire industry. But isn't there still a limit? You can't just have a hundred thousand people in a stadium, everybody getting fiber over the air. Yeah, sure. There's a limit, right? There's always a limit.
But with 5G, what we're doing is starting to access some of these higher spectrum bands. With LTE, we're primarily operating in the sub-3-gigahertz bands. With 5G, we're actually going to access bands in the 3 to 6 gigahertz range, and even bands up at 28 gigahertz and 39 gigahertz. And if you go up into these upper bands, you get very wide bandwidths, which is what it takes to support an environment like CES. But the European Union wants to use the 700 megahertz old TV spectrum for 5G, right? Yeah, so. That has long range. Sure. That goes through walls. Yeah, so with 5G our goal is a unified design across all the different spectrum bands. We're talking about low bands, sub 1 gigahertz, like 700 megahertz; bands from 1 gigahertz to 6 gigahertz; and then those upper bands above 24 gigahertz. And one smartphone, a little baseband in the smartphone, is going to support all that? Correct. With some of the advanced technologies we have, we're now able to support many more bands with the RF front end in our smartphone designs. And this is where expertise comes in. Absolutely, this is what Qualcomm excels at: the complexity of supporting all these different bands and all these different technologies in a form factor that fits in your pocket and is low power. And nobody else is able to match what you do, right? Just as we led the world to 3G and 4G, we expect to lead the world to 5G. But it's still a collaborative system. You can't just make a standard by yourself. Absolutely, we're collaborating with the industry to develop this new standard within the 3GPP standards body, and then we're working with vendors like Ericsson. We just recently announced our trial with Ericsson and AT&T to start testing these technologies in a real network. Is the 835 going to support 5G, or is that too soon? That's too soon. The 835 supports the gigabit LTE that we just talked about. Which is quite good. Yes, yeah.
But then maybe the next one. Maybe the next one. All right, can we walk over the booth a little bit? Sure. Can you talk about this VR demo? Sure. So what is the experience right here? So this is virtual reality. This actually won a Best of CES 2017 award for virtual reality. The reason it's so special is that it delivers six degrees of freedom. Today, in most virtual reality experiences, you're getting three degrees of freedom: you're standing still and looking around, as if you're there. But with six degrees of freedom, you can actually start walking around like you're in the actual environment. And that's what we're showcasing with this virtual reality demo that we have here. So that means you are combining Project Tango kind of stuff? You have cameras that know the surroundings? Correct. So you actually have cameras looking at your surroundings on the virtual reality glasses that you see here. So what are we looking at here? This is the actual virtual reality headset based on our Snapdragon 835. IR sensor, extra stuff. Do you want me to open it? Yeah. So this is just our reference device, like an engineering build, okay? So right here you have a very high resolution display? Yes, it's a 2K display. 2K display? Okay, so 2560 by 1440. And what are the cameras at the back? This one's the depth camera. That one's probably the regular camera. That looks like a flash. That's probably a light sensor. I don't know what that is. So it's not Project Tango, but is it similar? Yeah, yeah. So how is it different from Project Tango? Don't know. I don't know if it's not Project Tango. All right, let's walk around a little bit more. Look out for that. So is it the coolest VR demo in the world? Yeah, a Best of CES 2017 winner, yeah.
And again, the real differentiator here is giving that six degrees of freedom, which hasn't been possible before in virtual reality. Sure. I'm sorry. And what do you have, the Power Ranger? You have Power Rangers. Yeah, we partnered with Power Rangers to deliver this experience. What those people are experiencing is a virtual world that includes the Power Rangers. Nice. How's it going? Hi. So you have an HDR10 demo? Hi, can you talk about this? Yes, so what we're demoing right now is HDR10, the new standard, and we're showing you a side-by-side comparison of the two on the big screen. But what we're really showing is that our Snapdragon 835 processor is going to be the first to bring HDR10 to your mobile device. So which one is the HDR10? That one there. The right side. So it can support playback of HDR10? Yes. Now, for optimal performance, obviously you want a display that is going to be able to handle HDR10. Is a smartphone's display able to do that? They will be able to. We're going to release a list of them later this year. It needs a super-high-brightness LCD, right? Or OLED, it needs very... But otherwise, there could be some other tricks on the LCD. But then you just connect an external HDR display. You don't necessarily have to, but for the most optimal quality, you do want your... Is that 4K 60? 4K. So that's HDMI 2.1 output or something similar. All right, cool, thanks a lot. Thanks, and you have some more demos over there? This is actually Project Tango right here. How is it using Project Tango? Is it Project Tango? Project Tango is right over here. Right over here. Hey, can I check it out? Yeah, I'm actually demoing it if you like. Yeah, could I jump in? So this is actually on the market, right? Sorry? This is on the market. Yeah, this is the Phab 2 Pro. So now let's say, for example, you're in the market to buy a TV for your wall.
I'm not sure if your wall is big enough for a 75-inch TV. So you can launch Amazon. Yeah, so they made an app just for Tango. Nice. So then let's pick a 75-inch TV. 65-inch? 75-inch. 75-inch. Whoa. So it stays there. That's cool. Yeah, you can walk around. It just knows. It stays right there. It knows the exact size of the wall. That's mind-blowing, no? Oh, that's cool. You can go behind the TV and check the ports. That's awesome. This is one example. I can see it's there. Sorry? Have you seen the size of it? Sorry, the actual size of the TV. It knows the size of the wall. You know the size of the wall, so you can place the TV on the wall and imagine how the wall would look. You can put it on the floor also. Well, this particular app only does TVs. But let me show you another example. That's how you're buying furniture, right? For this space. Does IKEA do an app? Lowe's has an app and Wayfair has an app. I don't know if IKEA has an app here. So here, let me just put a stool right here. Can I try? Sorry? Can I try? Yeah, choose this one. Which one? You can choose. Yeah, okay. Where do I put it? Here? Can I hold this? Yeah, sure. Okay. Please, can you click somewhere on the floor? Oh, nice. Just click somewhere on the floor. Now you can walk around. Oh, that's cool. I'm sorry. That's beautiful. Can you walk around with me? Yeah. I'm losing it. That's so awesome. That's great. How much is this phone? It's not even expensive. $499. $499, and you get Project Tango. That's awesome. And which is the chipset? It's got a Snapdragon 652. Can you zoom in? No, you can't zoom in on virtual furniture, just like you can't zoom in on real furniture. You look at it just like you would look at real furniture. And what's the frame rate? Is it 60? No, this is running at 30, I believe. 30. Can you see the cameras? What are the cameras here that make this possible? So, yeah, I can show you the cameras. So it's got the fisheye.
It's got a depth camera, which is what makes the tracking possible. That's so cool. And this is a flash. That's the future of Android, right? Yeah, I guess so. That's cool. Thanks a lot. Thanks. That was fantastic. That's so cool. So here you're basically doing the same thing that my camera is doing with a shotgun mic? Hi, yes, very similar to that. Hi, my name's Sean D'Icon. We're showing here our audio zoom feature. The basic idea is that you can create these customized listening zones where you're only going to hear sound coming from those directions. So let's say you want to record your soccer game. You just want to hear the action on the field; you don't want to hear all of the parents chatting around you. This lets you do that. So there's no special hardware, it's all standard, right? Yes. And again, it's different than, say, a boom mic on your camera, where you have to aim it. This is getting sound from all directions at all times, letting you, through software and algorithms, decide what sound you want to come into the system. This is a special one that has multiple microphones, right? It's got three microphones. That's standard. Yeah, pretty standard. Is it standard today? We have launched this on a couple of phones. I believe the LG V20, which has launched, uses this feature. Again, all the processing is done on our DSP, so it's super low power. We can also do things like surround sound recording, since we know the direction of sound. Because you know the direction of sound, you can get a full 5.1 surround sound mix with your videos as well, rather than the mono or stereo you're limited to today. Nice, that's awesome, because smartphones record very nice video, but sometimes the sound is a little bit noisy. Right, and from a Qualcomm standpoint, audio is just as important to immersion. Especially in a noisy place like here.
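The "audio zoom" idea, steering a listening direction purely in software from a small microphone array, can be sketched as a delay-and-sum beamformer. This is a toy NumPy illustration with made-up array geometry, not Qualcomm's DSP implementation:

```python
import numpy as np

def delay_and_sum(signals, mic_x, angle_deg, fs, c=343.0):
    """Steer a linear mic array toward angle_deg by delaying and averaging.

    signals: (n_mics, n_samples) array; mic_x: mic positions in meters.
    Uses integer-sample delays for simplicity; a production system would
    use fractional delays or frequency-domain filtering.
    """
    angle = np.deg2rad(angle_deg)
    # Far-field arrival-time differences for a plane wave from angle_deg.
    delays = mic_x * np.sin(angle) / c
    shifts = np.round(delays * fs).astype(int)
    shifts -= shifts.min()                  # keep shifts non-negative
    out = np.zeros(signals.shape[1])
    for sig, s in zip(signals, shifts):
        out += np.roll(sig, -s)             # align each mic to the steered direction
    return out / len(signals)

# Toy example: 3 mics spaced 5 cm apart. A broadside (0 degree) source
# arrives at all mics simultaneously, so steering to 0 degrees sums
# coherently and the tone comes through unchanged; sources from other
# angles would be attenuated by destructive interference.
fs = 16000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 440 * t)
signals = np.stack([tone, tone, tone])
steered = delay_and_sum(signals, np.array([0.0, 0.05, 0.10]), 0, fs)
```

With three microphones, as on the demo phone, the beam is coarse; more mics or smarter adaptive weighting sharpen the listening zone.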
And we can do other interesting stuff like voice biometrics as well, with our SenseID voice technology. We can look at unique characteristics of your voice so it only responds to you and nobody else. We can even tell the difference between a recording of your voice and your actual voice, so it's very difficult to fool. So you could tap on the face of somebody and it could record only the sound from that person, because it can recognize it? Just record that person? Exactly. So nobody else gets picked up. Yeah, exactly. And you can also combine it with things like object tracking or face tracking, so it automatically steers the beam as well. So there are a lot of applications for this, and we're putting the technology into automotive, IoT devices, pretty much anything that has voice input. Let's look over there. Hey. That's me. All right. Stand over here. So what I'm showing here is the capability of the Snapdragon 835 processor to run deep neural networks on device, no connection to the cloud needed. So what you see here is a multi-object detector that's picking up a variety of different objects in the field of view. It's a little bit quiet today. Maybe some more people will walk through. The idea is that it's able to pick the CPU, the GPU, or the DSP to run the algorithm, depending on the power and performance profile of the user experience the developer is trying to create. Computer vision. It's more than computer vision. It can be applied to audio use cases like natural language processing and translation, keyword detection, and a variety of other data uses. So I'd say the computer vision you see here in this demo is a very demonstrable use case, but the opportunities and possibilities are literally endless when you're applying neural network algorithms to problems on the handset.
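The workflow described, taking a trained model and picking the CPU, GPU, or DSP at load time based on the power and performance profile, might look roughly like this. All names here (Runtime, pick_runtime) are illustrative assumptions, not the real Snapdragon Neural Processing Engine API:

```python
# Hypothetical sketch of on-device runtime selection for a neural net,
# in the spirit of the CPU/GPU/DSP choice described above.
from enum import Enum

class Runtime(Enum):
    DSP = "dsp"   # lowest power, fixed-point math
    GPU = "gpu"   # high throughput, moderate power
    CPU = "cpu"   # always-available fallback

def pick_runtime(available, prefer_low_power=True):
    """Pick the best runtime from what the device actually supports."""
    order = [Runtime.DSP, Runtime.GPU, Runtime.CPU] if prefer_low_power \
        else [Runtime.GPU, Runtime.DSP, Runtime.CPU]
    for rt in order:
        if rt in available:
            return rt               # first preference the hardware supports
    raise RuntimeError("no supported runtime")

# e.g. a device whose DSP driver is unavailable falls back to the GPU:
chosen = pick_runtime({Runtime.GPU, Runtime.CPU})
```

The design point the demo makes is exactly this fallback chain: the same trained model runs everywhere, and only the execution target changes with the device's capabilities and power budget.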
Sorry about that. Security cameras, drones, personal speakers. So the 835 has a neural network on the chip? It has the capability of processing neural network algorithms, yes. So it's been optimized for machine learning. So you can't do this stuff on the previous chip? You could, but the performance is incredibly better. Plus you have the ability to do this on our Hexagon Vector Extensions on the DSP, so you can get really fine precision. We supported Caffe in our prior version. Now we're supporting TensorFlow. So you have options to choose from as far as the kind of framework you want to train your model in. We support convolutional neural networks, recurrent neural networks, LSTMs. So it's pretty versatile, and it gives OEMs the ability to take their own trained neural net models, drop them into Snapdragon, and go. No muss, no fuss. Does that mean we get something even better than the Intel RealSense in the phone? I don't have anything to compare to from a benchmark standpoint. Or Movidius. What I can say is that you have a variety of options to choose from with the three different cores we have: the CPU, the Adreno GPU, and our Hexagon DSP. That's something that is unmatched in the industry from an SoC standpoint. And it's doing all that from a basic camera. Right. Is that a special camera? No special camera. It's using the camera that's on the device. No special hardware needed. That's awesome. Let me jump in over there. Yeah. There's another demo over there. Thanks for coming. Thanks. So here we have stabilization. Hi. So how does this work? So this is electronic image stabilization 3.0 running on our latest Snapdragon 835 processor. So it's super stable even though it's shaking. Right. We're shaking the display here at about a three hertz frequency. And it's not optical. It's not optical. It's digital, yes. So it captures more resolution in the video and crops it a little bit?
Right, it crops it a little bit, correct. But it's still a 4K recording? Exactly. So that means the camera is recording more than 4K? Yeah, well, it has to, right? A little bit. A little bit more, not 6K or 8K or something. Exactly. Maybe that's the next model. Ha, hopefully. That's awesome. Is this something that already ships on other phones? So EIS has been available as 1.0 and 2.0 on some of our earlier processors as well, the 820 and 821. 3.0 is just our latest version of EIS on the 835 that we're demoing here. It basically pulls information from the gyroscope and runs that information across some of the subsystems on the processor, like the DSP, the ISP, and the GPU, to correct for the shake. So it takes some of that shake out of it. And it's real time? Yeah. I mean, there has to be a lag. There's a little bit of a delay, right? Like microseconds or something? It'll vary. And these are obviously just our pre-production prototype devices, so there's a little bit of a delay as it processes, I think. This is awesome, to be able to have stable video. How much better is it compared to the previous versions you had? It's a little bit better. It's hard to judge that visually without a side-by-side, but this is the latest version, 3.0, the algorithms and technologies are always getting better, and we're running it on a more efficient, smaller, and faster processor. So you would see some improvements if we had a side-by-side. Is it okay to combine optical and digital stabilization, or do you have to choose? Well, I think in most cases we make it available as EIS, so that the OEMs we work with have a choice, right? OIS is still a viable solution for OEMs. But EIS being somewhat cheaper, because it's not mechanical, offers OEMs the ability to bring stabilization to a wider range of products at a wider range of price points, right?
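The crop-for-stabilization trade-off just described can be quantified: to output 4K while absorbing a given amount of rotational shake, the sensor has to capture a margin of extra pixels around the frame. A simplified pinhole-camera sketch, where the field of view and shake amplitude are assumed example numbers, not the 835's actual parameters:

```python
import math

def eis_margin_px(out_width_px, hfov_deg, max_shake_deg):
    """Extra horizontal pixels needed on each side to absorb +/- max_shake_deg
    of yaw, for a pinhole camera with the given output width and horizontal FOV."""
    # Effective focal length in pixels from the output width and FOV.
    focal_px = (out_width_px / 2) / math.tan(math.radians(hfov_deg / 2))
    # Small yaw rotations shift the image by roughly focal_px * tan(angle).
    return focal_px * math.tan(math.radians(max_shake_deg))

# Assumed example: 3840-pixel-wide (4K) output, 70 degree horizontal FOV,
# and +/- 2 degrees of residual shake to absorb.
margin = eis_margin_px(3840, 70.0, 2.0)
capture_width = 3840 + 2 * margin   # roughly 4030 px, i.e. "a little bit more than 4K"
```

This matches the exchange above: the sensor needs somewhat more than 4K, but nowhere near 6K or 8K, to stabilize a 4K output against small shakes.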
But if you have optical, it doesn't conflict with this technology? I'm not sure. I don't know if it conflicts. I know that we can support OIS as well; we're just showing an EIS solution here today. And does it compensate for other kinds of shaking? So what we're doing right now is basically shaking it in one axis, but EIS works in all three. So we're shaking it in this axis, but it also works like this, and for moving it like this. So it's like a plane: it's called pitch, yaw, and roll. And it corrects for all that motion, right. And is it something that the iPhone doesn't do? I don't know if the iPhone 6, in this case, has EIS or OIS, but I believe they have some type of stabilization. I'm not sure which one. All right, that's very cool. There's lots of stuff in this chip. Yeah, it's awesome. It's a lot of stuff built in, absolutely. Okay, thanks a lot. You're welcome. So there it is, the Qualcomm booth here at CES 2017. There were lots of demos. There were the Power Rangers. There was 360 4K video. There was supposedly the best VR demo ever; I haven't tried it. There are also smartwatches. There's a router. There are more routers. There's a ring. Smart door openers. There's medical stuff. There are more smartwatches. More medical. And some other IoT solutions right here. A RealSense competitor, maybe. And then we have the car. Connected car. That was Qualcomm.