I'm here with Toradex, you're doing some advanced AI stuff. Yeah, we're here at Embedded World 2019 and we're showing AI at the edge: what you can do with our system on modules and our partner network. Here we have a pasta demo; if you visited us last year you saw a very similar demo, so what did we change? Now we have an industrial-grade camera, a MIPI CSI camera with an integrated ISP, so some processing already happens on the camera itself. It's from our partner Allied Vision, and it's a brand new product called Alvium; it's just coming out, this is a very early sample. What is Alvium? Alvium is basically the ISP: they take sensors from the big sensor manufacturers and put everything together, so you get a complete system that's long-term available and really made for industrial automation. Until now, cameras like this came with Gigabit Ethernet or USB 3.0, and now you can use MIPI CSI, so this is a big thing. So it goes on the board next to the iMX8 QuadMax? Exactly. What we run here is an Apalis iMX8 Quad Max; it's here under the heatsink, and you can see there's no fan or anything. One of the most powerful ones, right? Yes, this is our highest-end module: it has two Cortex-A72 cores, four Cortex-A53 cores, and two GPUs. Here we actually don't utilize all of that, but you have a lot of headroom. We also work with a partner called Au-Zone, and they created a whole deep learning framework for going from, say, TensorFlow to an embedded device. So there's a hook into the AI frameworks from Google, or Caffe or something? Exactly, you can use those, but optimized for an embedded device, because those frameworks are really made for servers, for very high performance. They're very good for proof-of-concept prototyping, but when you want to deploy you need to optimize the model a little bit, and that's what we show here. And I mean, pasta detection is not a finished product, but it was very easy to create and it can detect different pastas.
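One core step in optimizing a server-trained model for an embedded target, as described above, is shrinking it. The sketch below shows post-training 8-bit weight quantization in plain Python; it is an illustrative example of the general technique, not Au-Zone's actual tooling, and the function names are mine.

```python
# Illustrative sketch of post-training int8 quantization, one of the
# standard techniques for shrinking a float32 model for embedded
# deployment. Not Au-Zone's real framework; names are hypothetical.

def quantize_int8(weights):
    """Map float weights to int8 using a single symmetric scale."""
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / 127.0 if max_abs else 1.0
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights for accuracy checks."""
    return [v * scale for v in q]

weights = [0.42, -1.27, 0.05, 0.9]
q, scale = quantize_int8(weights)       # 4 bytes instead of 16
restored = dequantize(q, scale)
```

The int8 tensor takes a quarter of the memory of float32 and maps well onto the SIMD and accelerator hardware on modules like the iMX8, at the cost of a small, bounded rounding error per weight.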
Then here's the next demo; this is a real-world product you can actually buy in a store. There's a camera down there and solar cells, and the idea is that this sits at the edge of a pool: you set it down on the pool edge, the camera is underwater, and it uses our Apalis TK1 module with deep learning to detect if somebody is drowning. So if your kid falls in and swims, it's okay, but if they don't swim it can detect that there's an emergency and it raises an alarm; it also connects to your cell phone, your phone rings, and you can rescue your kid, or your husband or wife. It's an Israeli company called Coral Detection Systems, and this is real. We also want to point out that we are part of the Arm leading edge partner network, so Arm gives us recognition for the innovation we do for industrial products. So what's a leading edge partner? Arm has two programs: the Innovator program, which is more for makers and very early stage companies, and the leading edge partner program, which is for commercial industrial applications and gives you recognition if you contribute to the Arm ecosystem. Nice, and here you have something going on with Intel. Exactly, we also show here a TK1 vision kit, but we show that it's pin compatible, so we run it with the iMX8 QuadMax. The heavy lifting of the deep learning inference runs on the backside, on a Mini PCIe card from Intel with a Movidius AI accelerator. So inside there's a Movidius; that's what some of the most advanced people use in drones and such. Yeah, exactly, you can use it in drones; it's very power efficient, very dedicated and cheap, and we see dedicated applications mostly in low power. We have a smart streetlight project where customers are beginning to deploy this kind of solution. And this is a demo that shows its high performance?
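The pool monitor's decision logic, as described, distinguishes a swimming person from a motionless one. A minimal sketch of that kind of edge-side state machine is below; the thresholds and field names are my assumptions for illustration, not Coral Detection Systems' actual algorithm.

```python
# Hedged sketch of drowning-alert logic layered on detector output.
# Thresholds and the sample format are illustrative assumptions.

ALARM_AFTER_S = 10  # seconds of stillness before raising the alarm

def should_alarm(detections, alarm_after_s=ALARM_AFTER_S):
    """detections: list of (timestamp_s, person_visible, moving)
    samples in time order. Alarm when a visible person has shown
    no swimming motion for alarm_after_s seconds."""
    still_since = None
    for t, visible, moving in detections:
        if visible and not moving:
            if still_since is None:
                still_since = t          # start of the motionless period
            if t - still_since >= alarm_after_s:
                return True
        else:
            still_since = None           # motion (or no person) resets it
    return False
```

A kid splashing around keeps resetting the timer; only a person who stays visible and motionless past the threshold triggers the phone alert.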
Exactly, that's the demo running here. The demo is provided by our partner Micra; they're very specialized in computer vision, so if you maybe don't have a lot of experience they can help you realize a product, and they're very familiar with our products. What is it, Gyrfalcon? Gyrfalcon, yeah. This is basically a similar system; here it's connected over USB 3.0, and we work closely and directly with that company, also driven by a customer, to bring high-performance AI at the edge to lower-performance devices. Here we run it on an Apalis iMX6; it's a quad core, it's quite powerful, but you can also run it on much lower-end devices. So this is a DNN inference accelerator? Yeah, a deep neural network inference accelerator. They use a special technology, basically processing in memory; that's kind of their secret sauce, how they made it. And our point is really to collaborate with them, so if our customers are interested in this technology they already know it works: we have a reference design, the SDK, the adjustments are made, so it's all about ease of use. And USB is perfect for that? Yeah, USB is very good for development, for proof of concept. Down the road we also expect a Mini PCIe card or M.2 card, so you can connect it a bit more ruggedized, a bit better for our... They have that too. Yeah, they have that too. So you start like this and then later you put it on the... Yeah, probably on a board, but you can even design it in directly: they have USB 3.0 and PCIe interfaces and even a kind of special eMMC interface. So you're working with some really cool, advanced AI stuff; it's going to be exciting, right? Yes, it's very exciting, and we have even one more. This one is over in our other area. Which one is this? This is a demo with our partner Xnor.ai, here in our area, on the iMX8 QuadMax again.
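The host-side pattern behind an external USB accelerator is always the same: the CPU prepares the input, the attached device runs the network, and the host reads back scores. The sketch below mocks that split; the `MockAccelerator` class is a stand-in of my own, not Gyrfalcon's real SDK.

```python
# Sketch of the host/accelerator split for offloaded DNN inference.
# MockAccelerator is a fake stand-in for a USB-attached device; the
# "scores" it computes are dummy arithmetic, not a real network.

class MockAccelerator:
    """Pretends to be a USB-attached inference device."""
    def __init__(self, labels):
        self.labels = labels

    def infer(self, image):
        # A real device would run the DNN in on-chip memory and
        # return one score per class; here we fake scores from bytes.
        n = len(self.labels)
        return [sum(image[i::n]) for i in range(n)]

def classify(device, image):
    """Host side: send the frame to the device, pick the top class."""
    scores = device.infer(image)
    best = max(range(len(scores)), key=scores.__getitem__)
    return device.labels[best]
```

The point of the pattern is that the application code on the Apalis module only ever sees `infer()`; whether it is backed by USB today or Mini PCIe or M.2 later is hidden behind that interface.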
And here we have three regular consumer-grade USB cameras, with three camera feeds running at about 30 to 40 frames per second, and we do people detection. We use less than 50% of the CPU and we don't use the GPU at all, so this is highly, highly optimized, and you can just take it; it's ready to go if you want to try it out on our product. And that's the iMX8 QuadMax again? Yes, we run that demo there under the heat sink. So you're just running on the... On the Arm cores. On the Arm cores, not on the GPU. Not on the GPU, no. So you can use the GPU for other things. You could do pre-processing, you can use it for the UI, you can use it for a lot of other things. What is this? Oh yeah, this is a streetlight. It has different sensors on it for air quality, and it also has a camera and it actually detects how many people walk by. So this is the AI stuff? It's not really AI; it uses LoRa to communicate, but it's really processing on the edge. They use a camera, but it's not deep learning, it's more traditional computer vision. Still, it's really an edge computing application. So you're more and more busy with this deep learning stuff; AI, you are at the forefront of that. Yes, and we really make it accessible for embedded customers. Of course a lot is done at universities, and Google, Facebook, Microsoft do great work, but we make it accessible for our customers, in our volumes, in our industry, so they can get access to that technology, and we really focus on ease of use: how do we get all of that from the universities into the hands of our customers so they can actually create products with long-term availability and all the other requirements they have. Because at Toradex you have so much experience with this industry, you are the right people to enable the AI future too.
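A streetlight node like the one above reports over LoRa, where payloads are tiny, so an edge device would typically pack its readings into a few bytes rather than sending JSON. The field layout below is my assumption for illustration, not the demo's actual protocol.

```python
# Illustrative sketch: packing streetlight readings (people count,
# air quality, temperature) into a 5-byte big-endian LoRa payload.
# The field layout is an assumption, not the demo's real protocol.
import struct

def pack_reading(people_count, pm25_ugm3, temp_c):
    """uint16 people count, uint16 PM2.5 in 0.1 ug/m3 steps,
    int8 temperature in whole degrees C -> 5 bytes."""
    return struct.pack(">HHb", people_count, int(pm25_ugm3 * 10), round(temp_c))

def unpack_reading(payload):
    """Inverse of pack_reading, run on the gateway side."""
    count, pm25x10, temp = struct.unpack(">HHb", payload)
    return count, pm25x10 / 10, temp
```

Five bytes per uplink fits comfortably in any LoRaWAN data-rate payload limit, which is the whole point of doing the counting on the edge and transmitting only the aggregate.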
Yes, I think we understand the capabilities of our customers, we understand their pain points, and of course they also tell us if something is not good. So we know that, and then we have the engineering and good connections to these startups, to universities, to the leading edge of that technology, and we try to bring it all together. Do you have many customers that just come up and say, what can we do with AI, can you help us get some ideas? No, not really. We try a little bit to give them ideas, because maybe they didn't think of it; for example in food processing there are a lot of machines, many of them with cameras, and I do believe that many of them could utilize AI. We show them what's possible, but the ideas basically come from them, and that's their strength: they know their very specialized market or very specialized product, and they just need to know it's possible, that with Toradex it's not two years of crazy development, it's pretty easy to get started. And then the support for all this? Yeah, the support is in many cases shared with partners, and we make it very easy so the partner solutions work: we have close integration to some degree to guarantee that you can get started. Of course, if you have a question about the algorithm from Xnor.ai you go directly to Xnor.ai, but if there is a problem where it's not clear whether it's software or hardware, the BSP from Toradex or something at a higher level, we're going to fix that with the customer in the background. So you can tell Toradex or you can tell the partner, and we're going to fix it; that's a big value of our partner ecosystem, so it's not everyone blaming each other and things like that.