We're here at Computex 2017. Who are you? I'm Adam, I'm with the demo team. And here's the awesome Samsung Chromebook Plus. I have one of those. So this is pretty smooth, right? Yeah, it's a really nice experience, more like a desktop or laptop experience in an interesting form factor. You can use it as a laptop, you can flip it around, and you can take out the stylus, if you can find it, and use it as a tablet. This is the hexa-core ARM Cortex-A72 and A53. Yep, big.LITTLE. Super smooth, amazingly awesome. Maybe I can ask if it's for sale in the UK? Yeah, I would love one of these; maybe I can live without the UK keyboard. You can buy them on Amazon.com, they ship worldwide.

And here's the Samsung... This is the S8, the S8 with DeX. That extends the phone out through the dock's HDMI and USB connections: you plug it in and you have your full desktop experience again. It's just a beautiful implementation. Nice and responsive, nice and fluid, drag the windows around. And it's running on an amazing 10-nanometer process. This is the Samsung Exynos with the M2 Mongoose processor, Mongoose 2 they call it, and the Mali-G71 GPU. So crazy fast performance, and finally a way to output the phone onto a bigger display and really have a nice UI. So your mobile device is home: you don't need a laptop or PC, you just plug it right in there, and that's your home office. And I think that's the argument, right? All these Android apps are built for ARM. Yeah, they're optimized for ARM, so all of this is perfect. And right here on the Chromebook you have all the Android apps too: you have your Chrome OS apps, and you now also have access to the Play Store Android apps as well, so your Android apps run on this cool screen.

And over here there are some other devices, like these Dash headphones. These are smart earbuds.
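The big.LITTLE pairing mentioned above steers demanding work onto the fast Cortex-A72-class cores and background work onto the efficient Cortex-A53-class cores. A minimal toy placement rule, assuming a made-up utilisation threshold; this is only a sketch of the idea, not the Linux scheduler's actual energy-aware policy:

```python
# Toy illustration of the big.LITTLE idea: place demanding tasks on fast
# "big" cores and light tasks on efficient "LITTLE" cores.
# Real kernel scheduling is far more involved; the threshold is invented.

BIG_THRESHOLD = 0.5  # hypothetical utilisation cutoff (0..1)

def place_tasks(tasks):
    """Map {task_name: utilisation} to a 'big' or 'LITTLE' cluster."""
    return {name: ("big" if util >= BIG_THRESHOLD else "LITTLE")
            for name, util in tasks.items()}

tasks = {"web_render": 0.9, "mail_sync": 0.1,
         "video_decode": 0.7, "sensors": 0.05}
print(place_tasks(tasks))
```

The point is simply that a heterogeneous CPU lets the system trade performance against power per task rather than per chip.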
Can you take them out? I can take them out; they'll turn on when I put them in. There's gesture control, so if we have an incoming call, I can nod my head and that picks up the call. They're also monitoring my heart rate through my ears. Nice. They can even do translation. The Pro version of the Dash also does translation, so if you listen to something in a foreign language, it will translate that into your language and play it back to you. And that's using an NXP... Yeah, it's the K24, a Cortex-M4. So the M4 talks to the phone, and the phone has a connection to the cloud.

And over here we have an Ambarella A12W for the video camera. Yeah, that's a Cortex processor combined with Ambarella's own ISP to do the image processing. And it's very simple: one button, you push that and it's now filming you for ten seconds, and it will automatically transfer that to the phone and push it up to my Snapchat account.

And over here is a really cool Huawei P10 Plus. That's Kirin 960 silicon, again with the Mali-G71 GPU and big.LITTLE A73 and A53, so really good performance as well. And this one is on the MediaTek X30, coming out at 10nm as well. This is a developer device; we're expecting the consumer devices for the X30 to come out basically this month.

And what are you showing here with the orange? So here we're doing a calorie-counting demo using the ARM Compute Library. If I show it the tangerine there... Pumpkin seed. Apple. And it recognizes them immediately. So it's offline: it's using machine learning to detect the food, and there's some computer vision to try and guess the volume. From that it's trying to guess the weight and the number of calories in that meal. It's just a proof of concept of the kind of thing you can do with machine learning and computer vision at the edge, offline, again using the ARM Compute Library to run it on the CPU and on the Mali GPU. Nice, this is awesome.
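The calorie demo's pipeline as described is: classify the food, estimate its volume with computer vision, then convert volume to weight to calories. A sketch of that shape, where the classifier, the volume estimator, and every density/calorie number are invented stand-ins (the real demo runs a neural network through the ARM Compute Library in C++ on the CPU and Mali GPU):

```python
# Sketch of the calorie demo's flow: classify -> volume -> weight -> kcal.
# All functions and constants here are hypothetical stand-ins for
# illustration, not the actual demo code or real nutrition data.

DENSITY_G_PER_CM3 = {"tangerine": 0.95, "apple": 0.84}  # assumed values
KCAL_PER_GRAM = {"tangerine": 0.53, "apple": 0.52}      # assumed values

def classify(frame):
    """Stand-in for the on-device ML food classifier."""
    return frame["label"]  # pretend the network is always right

def estimate_volume_cm3(frame):
    """Stand-in for the computer-vision volume estimate."""
    return frame["approx_volume_cm3"]

def calories(frame):
    food = classify(frame)
    grams = estimate_volume_cm3(frame) * DENSITY_G_PER_CM3[food]
    return grams * KCAL_PER_GRAM[food]

frame = {"label": "apple", "approx_volume_cm3": 200.0}
print(round(calories(frame)))  # → 87
```

Everything runs offline at the edge; only the two lookup tables would need to ship with the model.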
And this is just running on a Huawei? Yes, the Mate 9. And here you're showing foveated rendering. I think we've lost it... but it's in there. So here we're doing foveated rendering. These are infrared LEDs, in the headset here. Can we take the handset out and try to look in? So you modified the Samsung Gear to have IR LEDs and a camera. You can see the mirrors there, and there's a camera just in the middle, and that's actually watching the user's eye as they're using the headset. Because of that, we can optimize the rendering process: we only render in high quality where we know the user is looking. The rest of the frame can be lower quality, and that saves bandwidth and saves power. So when you have, let's say, a 4K VR display, it would be too much bandwidth to render everything at full quality, but you could maybe consider rendering just the part you're looking at. Exactly, exactly. And then you'd have savings on bandwidth and on power consumption. Yeah, exactly. And you could even enable future, even higher-resolution VR. Potentially, yeah, by only rendering in high quality exactly where it's needed and lowering it elsewhere.

So over here is something going on with the ARM Cortex-R52. The R52 is from the R series, which is obviously targeted at real-time and safety-critical systems. This is an automotive demo here. What we're showing is that the R52 introduces a hypervisor to the R series, and here we're running two separate pieces of software on two virtual processors: one is running the air conditioning system here, and one is controlling the door locks. And what you want is this: if I introduce a fault into the air conditioning system, you see it stops responding and we get the warning light, but our doors can still open and close. So you have two safety-critical systems and they're separated, so you can crash one and the other keeps running. And it runs on the new R52 real-time chip. Well, here it's actually what we call a Fast Model.
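The core decision in foveated rendering as described above is: given the eye-tracker's gaze point, shade the region around it at full quality and drop quality with distance. A toy version of that per-tile decision, where the tile radii and quality levels are made-up numbers, not values from ARM's demo:

```python
# Sketch of foveated rendering's key step: pick a shading quality per
# screen tile based on distance from the tracked gaze point.
# Radii and the three-level quality scheme are invented for illustration.
import math

def tile_quality(tile_center, gaze, full_radius=200.0, low_radius=500.0):
    """Return a quality level for one tile: 2 = full, 1 = half, 0 = quarter."""
    d = math.dist(tile_center, gaze)
    if d <= full_radius:
        return 2
    if d <= low_radius:
        return 1
    return 0

gaze = (960, 540)  # eye-tracker output, centre of a 1920x1080 eye buffer
print(tile_quality((960, 540), gaze))   # → 2, fovea: full quality
print(tile_quality((1300, 540), gaze))  # → 1, near periphery: half
print(tile_quality((1900, 100), gaze))  # → 0, far periphery: quarter
```

Since most tiles of a 4K eye buffer land in the low-quality bands, the GPU shades far fewer pixels at full rate, which is where the bandwidth and power savings come from.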
This is a software simulation, a developer tool that we provide so that software developers can start developing their software before the real silicon is available. Nice. And let's grab this awesome Gigabyte board right here. Is it fragile? Can we pick it up? It's heavy. Right here we have... This is pre-production. This is for the ThunderX2 processor. The ThunderX1 is obviously available, and that's targeted more at traditional server loads like web services. This one on the right, the ThunderX2, takes that up to high-performance computing applications. But as you can see, we're missing some components here; this is a pre-production board. It's beautiful. It's awesome. I need to do an interview with these guys. Sure, I can put you in touch.

So these are the latest demos of the whole world of... assistants. They're all ARM-powered. Yeah, so we have the Google Home with voice recognition, the home hub. We have a smart camera here, also using an Ambarella SoC, so it can do people detection. If you use this as a home security camera, it can tell the difference between your dog running around and some person in your house who you weren't expecting. We can also do things like people counting, so if you have a shop or a cafe, you can use it to monitor the number of people coming through your shop. Nice. Is the light ARM-powered? It could be, right? Actually, we do have a smart bulb. It's not doing anything smart at the moment, it's just a light. It could be like a speaker or something. It's probably ARM. All right, thanks a lot. No problem.
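The people-counting use case mentioned for the smart camera reduces to tallying entry and exit events produced by the on-device person detector. A toy sketch of that bookkeeping, where the event format is invented for illustration (the actual detection runs on the Ambarella SoC and its output format isn't described here):

```python
# Toy sketch of shop people-counting from a detector's track events.
# The (track_id, direction) event format is a hypothetical stand-in.

def count_entries(events):
    """events: list of (track_id, direction), direction is 'in' or 'out'.
    Returns (total visitors so far, people currently inside)."""
    inside = 0
    total_in = 0
    for _track_id, direction in events:
        if direction == "in":
            inside += 1
            total_in += 1
        else:
            inside = max(0, inside - 1)  # never go negative on missed events
    return total_in, inside

day = [(1, "in"), (2, "in"), (1, "out"), (3, "in")]
print(count_entries(day))  # → (3, 2)
```

All the hard work (telling a person from a dog, tracking them across frames) happens in the camera; the counting layer on top is this simple.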