Hello everyone, my name is Thiago Heis and I'm part of our MEMS and Sensors product marketing team. I would like to take this opportunity to demonstrate how to leverage smart sensors with machine learning capabilities to optimize IoT applications, specifically activity recognition. In today's world, people have the power to track a wide range of activities that are part of their daily lives. Wearable devices have empowered the consumer electronics market by bringing accessible, always-on products that enable such functionality. And of course, sensors play a key role in augmenting the user experience by providing higher accuracy and lower power consumption for optimal battery life. Now I want to take a moment to walk through the key building blocks that compose the machine learning core inside our sensors. As a starting point, our latest IMUs with machine learning core benefit not only from the sensor data coming from the accelerometer and gyroscope, but also allow the use of external sensors connected to the IMU through an auxiliary I2C bus. From there, the sensor data flows into the computational block of the device, which provides high-pass, band-pass, and second-order programmable digital filters, as well as features that can be selected to best describe the captured data pattern. In this case, the available features are mean, variance, energy, peak-to-peak, zero crossing, peak detector, and minimum and maximum values. After applying those filters and features to the data, the device runs a set of decision trees implementing a machine learning model built from the data you captured, and it provides as output the different classes recognized by the implemented model. In addition, the user can also rely on the meta-classifier feature for even higher reliability when providing the decision tree results to the main host microcontroller.
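To give an intuition for what those window features compute, here is a small illustrative sketch in plain Python. This is only a host-side illustration of the feature math described above, not ST's implementation; inside the sensor these features are computed in hardware over a configurable sample window, and the peak-detector feature is omitted here for brevity.

```python
def window_features(samples):
    """Compute, over one window of samples, the kinds of features
    the machine learning core can extract: mean, variance, energy,
    peak-to-peak, zero-crossing count, minimum, and maximum."""
    n = len(samples)
    mean = sum(samples) / n
    variance = sum((x - mean) ** 2 for x in samples) / n
    energy = sum(x * x for x in samples)
    # Zero crossing: count consecutive sample pairs with opposite signs.
    zero_crossing = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
    return {
        "mean": mean,
        "variance": variance,
        "energy": energy,
        "peak_to_peak": max(samples) - min(samples),
        "zero_crossing": zero_crossing,
        "minimum": min(samples),
        "maximum": max(samples),
    }

feats = window_features([1.0, -1.0, 2.0, -2.0])
```

In the real device, a decision tree then compares feature values like these against learned thresholds to classify the current window.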
Moving to the practical portion of this video, allow me to introduce the key hardware evaluation tool that will be used during this demo. The kit is the SensorTile.box, a ready-to-go IoT node including all the latest consumer-grade sensors from ST and, of course, our LSM6DSOX 6-axis IMU with machine learning core. The SensorTile.box supports a wide range of operational modes, and for immediate out-of-the-box evaluation of the machine learning core, our customers can use the basic and expert modes. In addition to the hardware tool, one of the key elements of this platform is our dedicated mobile app, the ST BLE Sensor app, available on both Google Play and the App Store. And finally, to access a wide range of machine learning core configuration examples, you can visit our dedicated repository, which includes step-by-step guides on how to get started with this powerful technology. All right, now let's go to the practical part of the activity recognition demo using the SensorTile.box. With the SensorTile.box hardware in hand, the first step is to connect to the dedicated mobile app that will act as the main user interface during this demo, the ST BLE Sensor app. I took the chance to include on this screen both QR codes, so you can scan and download the app regardless of the platform you are using for your own evaluation. With the ST BLE Sensor app open, the first step is to tap the "Create a new app" button at the bottom left of the screen. From there, a list of preloaded application examples that are part of the SensorTile.box basic mode will be shown. You can scroll down and see all the different application examples that are supported out of the box. Since this is an activity recognition demo, let's tap the Human Activity Recognition application example. Once the app page opens, you will find a short description highlighting the key aspects of this demonstration.
In this case, the key points to highlight are: we are using the LSM6DSOX, our ultra-low-power 6-axis IMU with machine learning core functionality, and since this is an activity recognition demo, it covers common activities such as stationary, walking, jogging, biking, and driving. And of course, this algorithm runs inside the machine learning core block of the LSM6DSOX. This example algorithm was developed envisioning a smartphone form factor, but by following the key steps behind the machine learning core during your development process, you can create your own implementation for your own hardware form factor. That is very important. The features calculated within the machine learning core to support this example are mean value, variance, peak-to-peak, and zero crossing, and the algorithm runs at 26 Hz. To test the algorithm, you just need to tap the play button in the top right corner of the app overview section. From there, the ST BLE Sensor app loads the example application into the SensorTile.box hardware, and a notification highlights that the app has been loaded successfully. Now it's just a matter of connecting the SensorTile.box hardware to the ST BLE Sensor app for real-time testing. Let's start with the stationary, or standing still, activity. The hardware will identify that the user is standing still and highlight the image of a character standing still. Now, let's say we start taking a walk through a park with the machine learning core-based sensor node. The algorithm will change its output and recognize that the user is walking. If you then pick up speed and start running, the machine learning core output will also recognize this event, allowing you to identify the activity and track, for example, the duration of this exercise cycle. The algorithm can also identify when you are riding a bicycle.
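On the host side, reading the machine learning core's result boils down to translating the value in its output register into one of the activity labels listed above. The sketch below is a minimal, hypothetical host-side decoder: the numeric code-to-label mapping is an assumption for illustration, since the actual values are defined by the configuration file loaded into the device, so verify them against the example you use.

```python
# Hypothetical mapping from the machine learning core's decision-tree
# output code to an activity label. The exact codes depend on the
# configuration loaded into the LSM6DSOX; treat these as placeholders.
ACTIVITY_LABELS = {
    0: "stationary",
    1: "walking",
    4: "jogging",
    8: "biking",
    12: "driving",
}

def decode_activity(mlc_out: int) -> str:
    """Translate a raw output-register value into an activity label.
    Unknown codes are reported explicitly rather than misclassified."""
    return ACTIVITY_LABELS.get(mlc_out, "unknown")
```

An application would poll (or receive an interrupt for) the output register, call a decoder like this, and update the on-screen activity accordingly.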
This is ideal for today's active lifestyle, keeping track of your fitness activity with an effective sensor node wherever you are. Last but definitely not least, the library will also recognize that the user is driving a vehicle, allowing, for example, a mobile phone to limit notifications and help the driver focus on the road. To finalize this demo, I just want to highlight that during our day-to-day routine, we perform several different activities, and the machine learning core together with the SensorTile.box is a great starting point for leveraging low-power sensors with machine learning-based technologies to augment your user experience. With that, I thank you very much for your time and attention, and for more information, please visit st.com.