We are moving into a connected life in which a significant number of devices have been designed to bridge the physical world and the digital world. In this context, users' behaviour can be sensed to enable interesting applications. We all know that healthy eating is crucial, and keeping track of what you eat helps you eat well. For users who suffer from a chronic disease such as diabetes, or who are overweight, diet monitoring is even more important, but it requires the right tool. Some tools have been developed that ask the user to manually log the details of each intake, which is quite tedious. Some researchers have leveraged AI to perform food recognition with camera-enabled devices such as smartphones; however, this raises privacy concerns.

So at HKUST we designed SmileUtensile, which can automatically track your food intake and its composition without any user involvement. The key idea is that the light spectrum reflected by food depends on its ingredients. Our utensil carries an array of LEDs; it modulates the LEDs sequentially and captures the reflected spectrum. By analysing the reflected spectrum and applying machine learning techniques, we can recognise the food type. When you can conveniently track users' eating behaviour and food composition, and match them against measured blood glucose values, you can analyse the data to identify risky foods, and even risky food combinations, and adjust your dietary habits accordingly. From this example, you can see that we are taking visible and invisible light, which is traditionally used for data communication, and leveraging it for sensing and food recognition. Optical sensing signals can actually help us sense even more, for example gait monitoring and gait analysis.
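The recognition step described above can be sketched in a few lines. This is a minimal illustration, not the actual SmileUtensile pipeline: the wavelengths and reflectance profiles below are made-up values, and a simple nearest-centroid rule stands in for whatever classifier the real system uses.

```python
import numpy as np

# Assumed LED array: five wavelengths (nm) modulated sequentially.
WAVELENGTHS_NM = [450, 520, 590, 660, 730]

# Hypothetical reference reflectance profiles, one per food type.
# Real profiles would be learned from training data.
REFERENCE_SPECTRA = {
    "rice":    np.array([0.62, 0.66, 0.70, 0.72, 0.74]),
    "chicken": np.array([0.40, 0.45, 0.55, 0.60, 0.58]),
    "spinach": np.array([0.10, 0.35, 0.20, 0.12, 0.55]),
}

def classify_spectrum(measured: np.ndarray) -> str:
    """Nearest-centroid classification: return the food whose reference
    spectrum is closest (Euclidean distance) to the measured spectrum."""
    return min(REFERENCE_SPECTRA,
               key=lambda food: np.linalg.norm(measured - REFERENCE_SPECTRA[food]))

# A noisy measurement near the spinach profile classifies as spinach.
noisy = np.array([0.12, 0.33, 0.22, 0.10, 0.52])
print(classify_spectrum(noisy))  # spinach
```

In practice one would train a proper classifier on many labelled spectra, but the core idea is the same: each ingredient mix leaves a distinguishable signature across the sampled wavelengths.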
By deploying a number of LEDs on the ceiling and corresponding sensors on the ground, nine key joints of the human body can be tracked. Medical research shows that a change in a person's normal walk can be an early sign of certain diseases, and research data also shows that continuous in-home walking-pattern monitoring can be more sensitive in detecting future cognitive decline. So with the optical sensing technology we described, we can actually help predict such diseases.

And visible light is not the only communication medium that can help us better sense human behaviour. There are other ambient signals we can leverage, including wireless and acoustic ones. In our lab, we leverage wireless signals to track a user's motion from the radio signal reflected by the user's body. Many interesting applications, such as elderly fall detection, gait monitoring, and even vital sign monitoring, can be enabled by this technology. The same story applies to acoustic signals: we transform the smartphone into an active sonar system that emits a frequency-modulated acoustic signal and listens to its reflections. By analysing the data, we can track multiple people's breathing movements and identify sleep apnea events.

All these cases show that we are leveraging wireless communication media to do better sensing, and this non-contact nature makes continuous sensing possible. Once we have collected the sensed data, we can apply machine learning to make use of it. With this ambient sensing capability and data analytics capability, we can help health providers see invisible signs and identify risky conditions earlier, so that doctors can design healthcare management schemes that provide a better life. And our idea does not only apply to the smart healthcare industry.
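The active-sonar idea can be illustrated with a toy frequency-modulated continuous-wave (FMCW) example: emit a chirp, mix the received echo with the transmitted copy, and read the target's range off the beat frequency. All parameters below (sample rate, chirp band, target distance) are assumed for illustration; this is a sketch of the principle, not the system's actual signal processing.

```python
import numpy as np

# Assumed parameters for a smartphone acoustic sonar.
FS = 48_000              # audio sample rate (Hz)
F0, BW = 17_000, 4_000   # chirp sweeps 17-21 kHz (near-inaudible band)
T_CHIRP = 0.04           # chirp duration (s)
C = 343.0                # speed of sound (m/s)

t = np.arange(int(FS * T_CHIRP)) / FS
chirp = np.cos(2 * np.pi * (F0 * t + 0.5 * (BW / T_CHIRP) * t ** 2))

def echo_distance(echo: np.ndarray) -> float:
    """Estimate target distance by mixing the echo with the transmitted
    chirp and locating the dominant beat frequency in the spectrum."""
    beat = chirp * echo                      # mixing (element-wise product)
    spectrum = np.abs(np.fft.rfft(beat))
    spectrum[0] = 0.0                        # ignore the DC component
    f_beat = np.fft.rfftfreq(len(beat), 1 / FS)[np.argmax(spectrum)]
    # Beat frequency maps to round-trip delay: f_beat = (BW / T_CHIRP) * tau
    tau = f_beat / (BW / T_CHIRP)
    return C * tau / 2                       # one-way distance (m)

# Simulate an echo from a reflector (e.g. a chest wall) about 0.5 m away.
delay_samples = int(2 * 0.5 / C * FS)        # round-trip delay in samples
echo = np.roll(chirp, delay_samples)
print(f"estimated distance: {echo_distance(echo):.2f} m")
```

Breathing monitoring builds on the same primitive: the chest's range oscillates by a few millimetres with each breath, so tracking the beat-frequency phase or fine range over successive chirps recovers the respiration waveform, from which pauses such as apnea events can be flagged.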
There are many other domains under the smart city umbrella that can be empowered by these sensing and data analytics technologies as more and more devices become connected. Thank you.