Hello everyone! My name is Dennis. I'm an application engineer working in MAMS, and I'm here at CES 2019 showing my demo about SLAM, simultaneous localization and mapping. We are fusing the data from the IMU sensor, our LSM6DSR, plus wheel encoders and the VL53L1X Time-of-Flight array, in order to perform simultaneous localization and mapping. The demo is meant to illustrate ST's know-how in sensor fusion algorithms for developing solutions for applications that require real-time localization. The SLAM algorithm was developed internally at ST and provides accurate real-time positioning. It fuses, as I said, the IMU, the Time-of-Flight sensors, and the wheel encoders.

This is the main SLAM block diagram. We first need to integrate and synchronize the data between the Time-of-Flight sensors and the IMU. Once the synchronization is complete, the main block, the extended Kalman filter, goes to work. It processes the data in sliding windows, window by window. Within the maze we identify keyframes, and when a loop is detected we re-estimate the matching features in order to refine the estimate and close the loop.

This is the hardware description of the demo. We have our VL53L1X Time-of-Flight sensors: the nine sensors together provide a 176-degree field of view, with an angular resolution of 1.5 degrees and a sampling rate of 5 Hz. Then we have the LSM6DSR IMU, which has a sampling frequency of 100 Hz and an excellent bias stability of 5 degrees per hour, plus a lot of digital features, including a finite state machine and a machine learning core. The algorithm fuses the IMU, Time-of-Flight, and encoder data and performs real-time loop closure, enabling accurate area mapping.

For any other information, visit our website, ST.com. Thank you and have a great day.
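To make the fusion idea in the demo concrete, here is a minimal sketch of an extended Kalman filter for a planar robot: the prediction step uses wheel-encoder speed and an IMU yaw rate, and the correction step uses a Time-of-Flight-style range measurement to a known wall point. This is only an illustration of the general EKF technique, not ST's internal SLAM algorithm; the landmark position, noise covariances, and function names are all hypothetical.

```python
import numpy as np

def ekf_predict(x, P, v, w, dt, Q):
    """Propagate pose [px, py, theta] with a unicycle motion model.
    v: speed from the wheel encoders, w: yaw rate from the IMU gyro."""
    px, py, th = x
    x_pred = np.array([px + v * dt * np.cos(th),
                       py + v * dt * np.sin(th),
                       th + w * dt])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                  [0.0, 1.0,  v * dt * np.cos(th)],
                  [0.0, 0.0,  1.0]])
    return x_pred, F @ P @ F.T + Q

def ekf_update_range(x, P, z, landmark, R):
    """Correct the pose with a range measurement z to a known landmark,
    as a ToF sensor would provide."""
    d = x[:2] - landmark
    r = np.hypot(d[0], d[1])            # predicted range
    H = np.array([[d[0] / r, d[1] / r, 0.0]])  # measurement Jacobian
    y = z - r                           # innovation
    S = (H @ P @ H.T + R).item()        # innovation covariance (scalar)
    K = (P @ H.T) / S                   # Kalman gain, shape (3, 1)
    x_new = x + K.ravel() * y
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new

# Drive 1 m straight, then correct with one range to a wall point at (5, 0).
x = np.zeros(3)
P = np.eye(3) * 0.1
Q = np.eye(3) * 1e-4
R = np.array([[0.05]])
for _ in range(10):
    x, P = ekf_predict(x, P, v=1.0, w=0.0, dt=0.1, Q=Q)
x, P = ekf_update_range(x, P, z=4.0, landmark=np.array([5.0, 0.0]), R=R)
print(x)  # pose estimate [px, py, theta]
```

A full SLAM pipeline like the one in the demo would run this kind of filter over a sliding window of measurements and additionally re-optimize past keyframe poses when a loop closure is detected; the sketch above covers only the per-step fusion.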