Good morning and welcome to this tutorial video from STMicroelectronics. I am Andrei Vitale, Sensor Ecosystem Manager in the Internet of Things Excellence Center in Santa Clara. Today I'm going to use the SensorTile development kit. We will see how to use the context awareness libraries and verify their performance. The context awareness running on the SensorTile is based on low-rate data coming from the accelerometer. There are three context awareness libraries: user activity recognition, carry position detection, and gesture recognition. User activity recognition is the capability to detect whether the user is standing, walking, running, and so on. Carry position detection is the capability to detect whether the device is on a table, in your trousers pocket, in your hand, or next to your head. Gesture recognition is the capability to detect when the device is tilted or shaken.

A short introduction to the SensorTile. The SensorTile is a reference design, an evaluation tool, and a development platform. It is a tiny, square-shaped module, only 13.5 by 13.5 millimeters. This module packs a powerful microcontroller, a Bluetooth low energy radio network processor, motion sensors (accelerometer and gyroscope), environmental sensors (magnetometer and barometer), and a digital microphone. The SensorTile is part of the SensorTile kit, which includes two different motherboards that can extend its capabilities: the SensorTile cradle board and the SensorTile cradle expansion board. If you want to learn how to assemble the SensorTile, search for the "Unboxing the SensorTile" video on YouTube.

Once you have assembled your SensorTile, you can connect it to your smartphone. You just need to download and install the ST BlueMS app from the Apple App Store or the Google Play Store. When you are ready, power up your SensorTile and run the ST BlueMS app on your smartphone. Connect the SensorTile to the ST BlueMS app: power up the device and wait until you see the orange LED blinking, then launch the ST BlueMS app. Push the Start Discovering button and you will see the device name in the list. In this case we are running the ALLMEMS1 firmware, version 3.2.0. Select the device and you will be connected.

User activity recognition (MotionAR). Press the More icon at the bottom of the screen. A list of items will appear. The first three items are about context awareness: first, Activity Recognition; second, Carry Position Detection; third, MEMS Gesture Recognition. Let's select Activity Recognition. Activity recognition will start immediately. Start shaking the board at a regular pace; in a few seconds the new activity will be identified. Five to ten seconds are required on average. When you are done, press the More icon again and select another item. You can also swipe to move to the next item.

Carry position detection (MotionCP). This is carry position detection. Now the device is standing still on the table. Watch what happens when I pick it up and hold it in my hand, slightly tilted with respect to the horizontal plane, so I can look at the SensorTile components. If I move the device so that it points up (microphone and antenna up), tilt it down, and shake it gently, it will be classified as shirt pocket. If I move the device so that it points down (microphone and antenna down), tilt it up, and shake it gently, it will be classified as trousers pocket.
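Under the hood, these demos are driven by ST's MotionAR, MotionCP, and MotionGR libraries running in the SensorTile firmware. Their exact APIs vary between X-CUBE-MEMS1 and ALLMEMS1 releases, so the following C sketch uses hypothetical stand-in names (not the real library calls) just to illustrate the general flow: a low-rate accelerometer sample is periodically fed into each classifier, which returns one of the discrete classes demonstrated in the video.

```c
/*
 * Hypothetical sketch of the classification loop behind the demo.
 * The real ALLMEMS1 firmware uses ST's MotionAR/MotionCP/MotionGR
 * libraries; every type and function below is an illustrative
 * stand-in, not the actual API.
 */
#include <stdio.h>

/* Activity classes mentioned in the video */
typedef enum { ACT_UNKNOWN, ACT_STANDING, ACT_WALKING, ACT_RUNNING } activity_t;

/* Carry position classes demonstrated in the video */
typedef enum {
    POS_UNKNOWN,
    POS_ON_DESK,         /* standing still on the table           */
    POS_IN_HAND,         /* held slightly tilted, facing the user */
    POS_SHIRT_POCKET,    /* pointing up, gently shaken            */
    POS_TROUSERS_POCKET, /* pointing down, gently shaken          */
    POS_NEAR_HEAD        /* microphone held next to the ear       */
} carry_pos_t;

/* One low-rate accelerometer sample, in g */
typedef struct { float x, y, z; } accel_t;

/* Stub classifiers: the real libraries keep internal state and need a
 * window of samples (5 to 10 seconds on average for a new activity). */
static activity_t  classify_activity(const accel_t *a) { (void)a; return ACT_WALKING; }
static carry_pos_t classify_position(const accel_t *a) { (void)a; return POS_IN_HAND; }

int main(void)
{
    /* In the firmware this would be a timer-driven loop reading the
     * accelerometer at a low output data rate. */
    accel_t sample = { 0.02f, -0.98f, 0.10f };
    for (int tick = 0; tick < 3; ++tick) {
        printf("tick %d: activity=%d position=%d\n",
               tick, classify_activity(&sample), classify_position(&sample));
    }
    return 0;
}
```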
Now if I pretend that this SensorTile is a smartphone and I place the microphone next to my ear, after a few seconds it will be classified as the head position. When you are done, press the More icon again and select another item. You can also swipe to move to the next item.

Gesture recognition (MotionGR). Let's select MEMS Gesture Recognition. The device is standing still on the table. If I pick it up, two events will be detected: the pick-up event and the tilt-to-glance event. If I then shake the device, the shake event will be detected. If I move the device so that it is horizontal, and then tilt it so I can look at it, the tilt-to-glance event will be detected.

Log context awareness data. We will now learn how to log the data. Select one of the context awareness functions: Activity Recognition, Carry Position Detection, or MEMS Gesture Recognition. Let's try with Activity Recognition. Press the Share icon in the top right corner of the screen. A pop-up menu will appear; select Start Logging and perform some activity. When you are done, press the Share icon again and select Stop Logging. The app will prepare an email with the log as an attachment. The log is a text file in CSV (comma-separated values) format, and it holds the output of the context awareness function that you selected.

During this tutorial we have seen how to use the SensorTile development kit to run the context awareness libraries and verify their performance. I hope you have seen how easy it is to use the SensorTile as an evaluation tool. Thank you for watching. Bye.
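As a companion to the logging demo above: the video does not show the exact column layout of the exported CSV, so this minimal C sketch assumes a hypothetical two-column "timestamp_ms,activity" format (and a hypothetical file name); adapt it to whatever header your ST BlueMS version actually writes.

```c
/* Minimal sketch for post-processing an ST BlueMS context awareness log.
 * The real CSV layout depends on the app version; here we assume a
 * hypothetical "timestamp_ms,activity" format with a one-line header. */
#include <stdio.h>

int main(void)
{
    FILE *f = fopen("activity_log.csv", "r"); /* file exported by the app */
    if (!f) { perror("activity_log.csv"); return 1; }

    char line[256];
    /* Skip the header line, if present */
    if (fgets(line, sizeof line, f) == NULL) { fclose(f); return 1; }

    long ts;
    char label[64];
    while (fgets(line, sizeof line, f)) {
        /* Parse "timestamp_ms,activity"; tolerate a trailing newline */
        if (sscanf(line, "%ld,%63[^\r\n]", &ts, label) == 2)
            printf("%ld ms -> %s\n", ts, label);
    }
    fclose(f);
    return 0;
}
```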