Okay, hello everyone, welcome to the Embedded Open Source Summit. Today my colleague Hebo and I will introduce a new Zephyr sensing subsystem. My name is Kehan. I'm from Intel, a principal engineer working on the integrated sensor hub solution, focusing on firmware, software, and algorithms. Hebo, maybe you can also introduce yourself. Hi, my name is Hebo Hu. I'm also from Intel, a senior software engineer working on integrated sensing solutions and focusing on software development. Okay, thank you.

Okay, so this is the agenda for today. The Zephyr sensing subsystem is a new concept we introduced this year to the community. We will go through the background on why we need it, and we will explain our motivation and our goal. We will introduce in detail how it works, what it is, and what its features are, and we will show an application of what it can do. We will update the status: how far we got, what we have implemented and what is not there yet. We will also show demos of how it works. And at the end, we will show our next-step plan and our roadmap for upstreaming this sensing subsystem to Zephyr. Okay, so we are very excited to introduce this new Zephyr sensing subsystem to the community today, and you are also welcome to participate in this development together with us. Thank you.

So here is the background. Zephyr today, as we know, already has a sensor solution. This sensor solution is based on devicetree configuration. It supports developers who write sensor device drivers, with a lot of sensing features like sample fetching, triggering, attributes, calibration, and processing. There are already about 140 device drivers in the Zephyr repo and more than 900 commits.
So the typical sensor usage on Zephyr today is that the Zephyr application directly calls the sensor device driver, streams the data, and configures the sensor into different working modes or behaviors. This typical usage is very suitable for simple sensor use cases, like IoT use cases, in which the number of sensors is limited and the number of applications is also limited. However, we think today's sensor solution still has a lot of room to improve. For example, one improvement area is that sensor applications on Zephyr today interact directly with the sensor device drivers in kernel space for all sensor operations. It is not that friendly for an application to directly use a kernel driver. There is a lack of high-level abstraction, general function management, multi-client arbitration, and a high-level sensor API in the Zephyr sensor solution today.

So here is our motivation. In one short sentence, our motivation is to take the Zephyr sensor solution to the next level. What we plan to do is introduce a new sensing subsystem and framework to support a set of new sensing features: for example, high-level sensor abstraction, general function management, multi-client arbitration, sensor topology, scheduling, fusion processing, timestamping, batching, and so on. The new sensing subsystem, or framework, sits on top of the sensor device drivers. It will manage all the sensor device drivers below it, provide common sensor features, and provide a high-level sensor API to the sensor applications. Multiple sensor applications can use the sensing subsystem to access sensor data or configure the sensors, and the subsystem will provide multi-client arbitration, which allows each sensor application to use the sensors freely, without worrying about conflicts with other applications. And with additional components, our motivation is to establish this new sensing subsystem as the sensor framework in sensor hub hardware.
It will be able to support multiple OSes and meet multiple OSes' sensing requirements, and also CHRE, the Context Hub Runtime Environment, which comes from the Google Android project and is already an upstream module in Zephyr today. Okay, so my colleague Hebo will start to introduce the details of the sensing subsystem. Hebo, please go ahead.

Okay, now let's have a look at the new sensing subsystem architecture overview. At the bottom layer is the Zephyr kernel space: the board BSP drivers, including the I/O peripherals like I2C, I3C, SPI, and GPIO, and the sensor device drivers, like the accelerometer, gyroscope, and light sensors. All of these are the existing sensor stack in Zephyr. The middle layer is our new sensing subsystem. It lives in Zephyr user space, and it provides the high-level sensing subsystem APIs, with multi-client support, to the upper-layer Zephyr applications. The top layer is the Zephyr applications, and multiple sensor clients can use the same sensing subsystem APIs at the same time.

Okay, so next. Yeah, these are the major concepts of the new sensing subsystem. First, we have two kinds of sensor abstractions. A virtual sensor focuses on sensor algorithms, and a physical sensor is a wrapper around an existing sensor device driver, producing sensor data for the virtual sensors. These two kinds of sensor abstractions are composed in a chain model: physical sensors report to virtual sensors, and a virtual sensor can report to another virtual sensor. Based on this chain model, we have two major paths: the config flow path and the data stream flow path. The config flow path goes from top to bottom, from the application to the sensors.
And the data stream flow path goes the other way, from bottom to top: from the sensor driver, through the sensing subsystem's virtual sensors, and finally to the application client. The whole subsystem supports multiple clients with arbitration, so each sensor can be opened and configured by multiple clients, and each application client does not need to worry about configuration conflicts with other applications; all of this is handled carefully by the sensing subsystem. For example, currently we support report interval arbitration: one application sets 100 Hz for an accelerometer, another application sets 200 Hz for the same accelerometer, and the sensing subsystem will arbitrate the report intervals.

The sensing subsystem can also support the Android CHRE framework: it can meet CHRE's platform abstraction layer requirements to port CHRE. It can also support non-CHRE applications, like the HID sensor application to support Windows clients, or MQTT-related Zephyr applications to support IoT products. Okay, next.

Okay. So Hebo just introduced the core concepts of the Zephyr sensing subsystem. On this page, we have a table listing the Zephyr sensing subsystem features. The scope of the Zephyr sensing subsystem focuses on sensor fusion, arbitration, sampling, timing, and scheduling. We provide a sensor abstraction, which is the physical sensor and virtual sensor concept. We provide different data-driven models, polling mode and interrupt mode. The sensors will be scheduled inside the Zephyr sensing subsystem, and there will also be batching support and power management. We will rely on the existing Zephyr APIs, like the sensor APIs and RTIO, and all the sensor objects will be configurable from the devicetree.
And our design principles: we take reference from the Intel proprietary ISH sensor framework design, which has been in mass production for more than 10 years with active evolution. Our solution will be fully open source as a Zephyr-native subsystem. We already started our upstreaming work in 2023: our first PR has been submitted and is under review, and more PRs are coming later. And we are working closely together with Google on this upstreaming work. The Zephyr sensing subsystem will be fully configurable. Our key idea is to make it a reusable, standalone subsystem, which means it will be configurable and also flexible, as Hebo just said. It can support multiple different applications, like CHRE, MQTT, HID, or proprietary applications. In the next few pages, we will show an example application of the new Zephyr sensing subsystem and also demos of how this works on real boards. Okay, so, Hebo.

Okay, and this is the example application: the Intel integrated sensor hub implementation for Chrome OS. The Intel integrated sensor hub (ISH) is a microcontroller integrated into Intel's client SoCs. In the diagram, at the bottom, there are the ISH-related BSP drivers, like power management and the I/O peripherals. For sensor drivers, we reuse the existing Zephyr sensor device drivers, like the accelerometer, gyro, magnetometer, and ambient light sensors; all of these sensors reuse the existing Zephyr ecosystem. Based on this existing standard sensor device driver API, we fully reuse the new sensing subsystem to implement the whole sensor subsystem, and the virtual sensors required by Chrome OS, like the lid angle virtual sensor, will be implemented as virtual sensors within this new sensing subsystem. On top of the new sensing subsystem APIs, we have the Zephyr application, and in this Zephyr application we have two host link clients.
One is the HID host link client, which wraps the underlying sensors from the sensing subsystem into standard HID sensors and exposes them to the Chrome OS host through the HID and IIO driver stacks. These stacks already exist in the Linux kernel, so all of them are reusable. The other client is the CHRE host link client. It can consume the same sensors as the HID client at the same time. So with the CHRE framework, if you have some sensor features which you want to dynamically offload from the host to the sensor hub, you can use the CHRE stack as well. Okay, and later we will have a demo video to show all of this.

So next, let's have a look at the current status. Currently we already support the Intel Chrome client reference platform sensor boards, with the accelerometer sensors, the hinge angle virtual sensor, and other virtual sensors. For features, we have implemented all the sensing subsystem APIs except batching; batching is still work in progress. We have the whole functional framework with the physical sensor and virtual sensor abstractions, and we have already implemented multi-client support, with the CHRE and HID sensor applications running at the same time. For arbitration, we now support report interval and data-change sensitivity arbitration. Currently we leverage the standard Zephyr sensor device APIs for the physical sensors, so for all the physical sensors we reuse the existing Zephyr sensor device drivers, just as we mentioned on previous pages. We fully support the polling and interrupt data-driven models with single-thread scheduling. And all of this can be configured with the devicetree: in the devicetree, you can describe the whole sensor topology and the relationships. We also implemented the end-to-end sensor streaming data paths, for both the HID IIO path and the CHRE path.
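As a hedged illustration of describing the sensor topology in the devicetree, a fragment like the following sketches two physical accelerometer wrappers feeding a hinge angle virtual sensor through a `reporters` phandle list. The node names, compatibles, and property names here are illustrative assumptions, not necessarily the exact upstream bindings:

```dts
/* Hypothetical sensing topology sketch: two physical 3D sensors
 * wrapping existing device drivers, and one virtual sensor that
 * lists them as its reporters. */
sensing: sensing-node {
    compatible = "zephyr,sensing";

    base_accel: base-accel {
        compatible = "zephyr,sensing-phy-3d-sensor";
        underlying-device = <&accel0>;   /* existing Zephyr driver */
    };

    lid_accel: lid-accel {
        compatible = "zephyr,sensing-phy-3d-sensor";
        underlying-device = <&accel1>;
    };

    hinge_angle: hinge-angle {
        compatible = "zephyr,sensing-hinge-angle";
        reporters = <&base_accel &lid_accel>;  /* chain model wiring */
    };
};
```

The `reporters` list is what encodes the chain model described earlier: the config flow walks down these references, and the data stream flows back up them.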
So later we will see this in the demo videos. And we have started the upstream work: the first proposal has been reviewed with the Zephyr community, and the first PR has been submitted and is now under review. Okay, next let's see the demo videos. We have two demo videos here. One is the demo on the Intel client reference board with the ISH, supporting Chrome OS. The other example shows how the sensing subsystem works with the same sensors running on an ARM-based board. Okay, let's first have the demo with Intel's ISH.

Let's demo the physical accel sensors and the hinge virtual sensor data streaming through the CHRE path. Here is the demo with Chrome OS running on an Alder Lake-P development platform. The left side is the ISH firmware log output through the UART port. The right side is an SSH console on the Chrome OS host. This line is the physical accel on the base. We can see the base accel data on X, Y, and Z; the unit is micro-g. Since it's lying flat on the table, the value of the Z axis is close to one g. This line is the physical accel on the lid. We can see the lid accel data on X, Y, and Z. Since it's vertical to the table, the value of the Y axis is close to one g. With the data from the two accels on the lid and the base, a hinge virtual sensor is created, and we can see from the log that the hinge angle is 96 degrees.

Next, let's demo the physical accel sensors, the motion detector sensor, and the hinge virtual sensor data streaming through the HID path. The IIO sensor tool is a command-line tool based on the Linux IIO subsystem. Here we use the IIO sensor tool to enumerate all the sensors' information, including the sensor indexes and names. We specify the index value 3 to show the lid accel data; here `-n 30` means fetching 30 samples, and `-f 100` means setting the frequency to 100 Hz. Then we can see the lid accel data on X, Y, and Z; the unit is meters per second squared. The other accel sensor is the base accel.
We specify index value 1 to show the base accel data. The next sensor is the motion detector sensor; we specify index value 4, and `-t 30` means fetching samples for 30 seconds. When the device is moving, the motion detector data changes from zero to one, and after it stands still for a while, the data changes back to zero. To show the hinge angle sensor, we specify sensor index value 2. When we close the lid, the data approaches zero degrees; when we open the lid again, the data changes from zero up to 180 degrees. Thank you for watching the demo.

Okay. Let's have a look at the other demo, which runs on an ARM-based open board. This demo was enabled by Tom Burdick, so thanks for his great efforts. With the sensing subsystem, it supports the same sensors: the two accelerometer sensors and the two virtual sensors, hinge angle and motion detector. Okay. So next, Kehan will introduce our next plan.

Okay. Thank you. So this is our next plan. Our goal is to catch the Zephyr 3.6 LTS release and complete the Zephyr sensing subsystem upstream work by Zephyr 3.6. Right now we are introducing the Zephyr sensing subsystem core, with the PR submitted, in Zephyr 3.4. We will extend the features and board support in 3.5 and stabilize the solution by 3.6. In the meantime, we are closely working with Google on enabling Chrome OS production support: we will enable the Zephyr sensing subsystem in Google and Intel Chrome OS production, meeting all the production requirements. We will also enable the Intel ISH on the latest open development boards: like the ARM-based open board just shown, there is also an Intel ISH open board as well.
And furthermore, we will introduce more and more advanced sensing features to this new sensing subsystem, including full CHRE support, Chrome OS support with the HID sensors, and human presence sensing, which is a pretty hot topic in the client sensing world today. We will add sensor batching, which is still work in progress right now. We will also support multi-thread processing, which is important for some heavy-workload algorithm processing. And so on, yeah.

So, okay, that's all for our presentation today. Thank you for joining our session at the Embedded Open Source Summit. If you have any questions or any interest, just feel free to contact me or Hebo. Okay, thank you very much. Thank you.