Hello everybody, my name is Thiago Hayes from STMicroelectronics, and today we are here at AWE. We are showcasing our latest 6-axis IMUs with machine learning core capabilities. The key device for today is the LSM6DSV16X, which delivers outstanding noise performance together with very low power consumption. The device includes a set of smart features such as finite state machines, the machine learning core, and adaptive self-configuration. In this demo we show how to use the IMU to process custom gestures inside the device itself, leveraging the machine learning core. We are covering four gestures: stationary, nodding, head shaking, and head swing. Let's take a look at how those gestures shape up.

Now let's take a closer look at the actual demo. We have a pair of glasses, and on top of them sits our evaluation kit, the SensorTile.box. Note that this is a customized SensorTile.box using the LSM6DSV16X as its 6-axis IMU, highlighting the head gestures we are showcasing today. The SensorTile.box is a perfect evaluation vehicle for this kind of application: it's a ready-to-go sensor node with all the latest sensors from ST, plus a BLE connection to a laptop in this case, which gives you great data visualization on the fly.

With the glasses on, the first gesture is nodding. As you can see, the machine learning core output shows nodding as one, correctly identifying the gesture. Now, shaking your head: the shake gesture is triggered, so your application knows that you are shaking your head. Next is swing. Head swing is also covered, and as you can see, the decision-tree results show the proper gesture following my movement.
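On the host side, the decision-tree result read from the machine learning core could be turned into a gesture label with a small lookup. The sketch below is a minimal illustration, assuming hypothetical class codes (the transcript only confirms that nodding reads as one; the other codes are defined by your MLC configuration, not by this demo):

```python
# Hypothetical mapping of MLC decision-tree output codes to gesture labels.
# Only "nodding = 1" is confirmed by the demo; the remaining codes are
# placeholder assumptions tied to a specific MLC configuration.
GESTURES = {
    0: "stationary",
    1: "nodding",
    2: "shaking",
    3: "swing",
}

def decode_mlc_output(raw: int) -> str:
    """Translate a raw MLC decision-tree result into a gesture name."""
    return GESTURES.get(raw, "unknown")

# Example: a raw value of 1 corresponds to the nodding gesture.
print(decode_mlc_output(1))
```

In a real application, `raw` would come from the sensor's MLC source register over I²C/SPI or via the BLE link from the SensorTile.box.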
And last but not least, the stationary condition: when there is no head movement and none of the gestures is identified, the machine learning core considers the user stationary. Beyond the head gestures in this demo, you can leverage the machine learning core for a wide range of applications; we have multiple examples available on our dedicated GitHub repository. For more information, visit st.com/mlc, where you'll find a detailed step-by-step guide on how to use the machine learning core and get the best out of the 6-axis IMUs from STMicroelectronics. Thank you so much, and see you next time.