Here at SID Display Week with Qeexo. And who are you?

Hi, I'm Sang, and I'm the CEO of Qeexo, and we built a machine-learning-based user interaction platform. Today I'm presenting a solution called TouchTools. It's a solution that allows conventional multi-touch devices to understand your intention just by looking at the pose of the hand touching the screen. When you're interacting with an interactive surface like this, typically you would use your finger to draw, and when you want to erase this line, you would need to go to a toolbar, select your eraser, come back, and erase. But rather than doing that, what if the device could understand that you're looking for an eraser just by looking at the pose of the hand that's touching the screen? So now I imagine that I'm holding an eraser, put my hand down, and I get an eraser.

But what if you want to draw a straight line? To draw a straight line, I need to use a ruler. A ruler is about this big, right? I can hold my ruler with my fingers and then draw a line. Or if I want to change the color of the line, I open up my color palette, which is like a dial, and select a color; let's go with white. I can draw a white line. Or what if I want to measure the length of this line? In real life, I would grab a tape measure and pull out the tape to measure the line. Or if I want to capture the image, in real life I would grab my camera and put it down on the screen: grab a camera, and then I can take a partial screenshot of that area.

So it's basically a very intuitive way to summon virtual tools without having to perform a cumbersome gesture or even go to the toolbar. You can summon those virtual tools on the spot, just by imagining that you're holding the actual physical tool and putting it on the screen.

So it works on an iPad also, or on Android?
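The idea described above, mapping the contact pattern of a hand on the screen to a virtual tool, can be sketched roughly as follows. This is a minimal illustrative sketch, not Qeexo's actual algorithm: the tool names, contact-count rules, and the 150-pixel spread threshold are all assumptions for demonstration.

```python
import math

def classify_pose(contacts):
    """Map a set of (x, y) touch contacts to a virtual tool.

    contacts: list of (x, y) points reported by the touch screen.
    The rules below are illustrative guesses, not a real classifier.
    """
    n = len(contacts)
    if n == 1:
        return "pen"       # single fingertip: just draw
    if n == 2:
        return "ruler"     # two fingers holding a ruler edge
    # Spread: maximum pairwise distance between contact points.
    spread = max(math.dist(a, b) for a in contacts for b in contacts)
    if n >= 5 and spread < 150:
        return "dial"      # fingertips bunched as if gripping a knob
    if n >= 4:
        return "eraser"    # wide, flat-hand contact pattern
    return "unknown"

print(classify_pose([(0, 0)]))                                   # pen
print(classify_pose([(0, 0), (40, 5)]))                          # ruler
print(classify_pose([(0, 0), (30, 10), (60, 0), (40, 40), (20, 50)]))  # dial
```

A production system would look at richer features (contact shapes, orientation, timing) and use a trained classifier rather than hand-written thresholds, but the interaction model is the same: the pose at touch-down selects the tool on the spot.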
Yeah, it can work on interactive whiteboards or even on a tablet. This is a demo to show you how TouchTools can work in an automobile environment, if you're driving a car with an interactive surface or touch screens. In conventional cars you use a dial to change the volume or change the music, but on a touch screen you actually have to focus on the screen to operate it. Rather than doing that, what if you could summon a dial, a big dial, just by making the hand pose of grabbing a dial anywhere on the screen? You get your dial. Or if you want to change the station, I grab a smaller dial and turn the knob to change the station. Or if I want to open the sunshade, typically there's a shade in the roof that I can slide open, so I slide it with my finger.

Are you able to have this functionality on top of any Android app? Is this Android?

Yes, it can work on top of any Android app, or even iOS.

iOS also, on top of anything? Or do you have to run your own app?

Our solution is a platform, an interaction platform, so it can work on top of any app. Or if we want to customize functions for a certain app, then we can build our engine into the app.

All right, do you show a different demo on this one?

It's the same demo, running on Windows.

All right, cool. So is it licensed?

Yes, our business model is a software licensing model. We work with component manufacturers and OEMs to embed our solution into their devices.

And how soon is this shipping? When will people have this?

This platform, TouchTools, will be available in the market towards the end of this year.

All right, cool.
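The dial interaction from the car demo, turning a summoned knob to change volume or station, can be sketched as tracking how the fingers rotate around the grip's centroid between touch frames. This is an assumed illustration, not the product's implementation; the helper names and the sensitivity factor are made up for the example.

```python
import math

def centroid(contacts):
    """Center of the grip: mean of all contact points."""
    xs, ys = zip(*contacts)
    return sum(xs) / len(xs), sum(ys) / len(ys)

def dial_delta(prev_contacts, curr_contacts):
    """Rotation (radians) of the first finger around the grip centroid
    between two frames, wrapped to (-pi, pi]."""
    def angle(contacts):
        cx, cy = centroid(contacts)
        x, y = contacts[0]
        return math.atan2(y - cy, x - cx)
    d = angle(curr_contacts) - angle(prev_contacts)
    return (d + math.pi) % (2 * math.pi) - math.pi

# Four fingers gripping a knob, then rotated a quarter turn clockwise
# (screen coordinates; the points are the same circle, shifted by one).
before = [(1, 0), (0, 1), (-1, 0), (0, -1)]
after = [(0, 1), (-1, 0), (0, -1), (1, 0)]
volume_change = dial_delta(before, after) * 10  # sensitivity = 10 per radian
print(round(volume_change, 2))
```

Because the rotation is computed relative to the grip's own centroid, the dial works anywhere on the screen, which is the point of the demo: the driver does not have to aim for a specific on-screen widget.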