Hey, good morning and welcome to the ST booth again. We are live, and that's the best thing this year: we can finally meet our developers and engineers in person. But you know, guys, I'm also very happy to see you and talk to you, because I'm sure you're interested to see and hear more about the things we are showing here this year. So let's check it out together. You've been working on a lot of stuff over the last three years. That's true, that's true. We did not stop. The situation was not easy, but our engineers and developers continued to work on new products and new technologies, and we are definitely showing some of them over here. I've been looking forward to this moment for a couple of years. Me too, me too. You know, last year we ran the show over the digital platform, but this is much better, being here together. And what about our guests? Yeah, yeah, exactly. So the first thing I would like to show you today is our demo and solution using artificial intelligence for people counting and people detection. And I asked Vincent, my colleague, to show you and explain a little bit more about it. Hi, I'm Vincent from the AI MPU team at STMicroelectronics. In this spot we showcase AI on the STM32MP1 MPU, and we promote our X-LINUX-AI package, which is a software package that runs on the MP1, to demonstrate people counting. Everything is done on the edge, and no private data is transmitted to the network. With this neural network running on the MP1, we are able to detect a person, detect the feet, and transmit only the feet coordinates, to display on a virtual map where the people are located. What's X-LINUX-AI? X-LINUX-AI is a software package that my team delivers to provide all the AI frameworks to run on the MP1, meaning that TensorFlow Lite is delivered with it. We also deliver some examples and demonstrations.
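The privacy-preserving pipeline Vincent describes, detecting people on the device and sending out only foot coordinates for a virtual map, can be sketched in a few lines. This is an illustration, not ST's actual code; the frame size, map size, and linear pixel-to-map calibration are stand-ins for a real camera-to-floor mapping.

```python
# Minimal sketch of the people-counting idea: bounding boxes stay on
# the device; only mapped foot coordinates and a count are published.

IMG_W, IMG_H = 640, 480          # camera frame size (assumed)
MAP_W, MAP_H = 10.0, 8.0         # virtual map size in metres (assumed)

def foot_point(box):
    """Bottom-centre of an (x, y, w, h) bounding box in pixels."""
    x, y, w, h = box
    return (x + w / 2.0, y + h)

def to_map(pt):
    """Scale a pixel foot point onto the virtual floor map."""
    px, py = pt
    return (round(px / IMG_W * MAP_W, 2), round(py / IMG_H * MAP_H, 2))

def publish(boxes):
    """What actually leaves the device: a count and map dots, no pixels."""
    dots = [to_map(foot_point(b)) for b in boxes]
    return {"count": len(dots), "dots": dots}

msg = publish([(100, 50, 60, 300), (400, 80, 50, 280)])
print(msg)   # {'count': 2, 'dots': [(2.03, 5.83), (6.64, 6.0)]}
```

The point of the structure is that the raw image never appears in `publish`; only derived coordinates do, which is exactly the privacy argument made in the demo.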
So it's optimized Linux for AI? Yeah. On embedded. On embedded, and on CPU only today. And we also use a board from Sena Systems, which is a partner of ours. It's called the MP Cam, and you can find it on their website. And it's a commercial board? Sorry? You can just buy it. Yeah, you can just buy it on their website. And doing AI on the edge is really important to save energy. Energy, and also privacy: no sensitive data goes outside the board or through the cloud. Can you also explain the demo a little bit? We see some dots moving. For the demo, you can see some dots moving. So my dot is there. So now we see your dot? Yeah. And if I step outside the area, my dot moves there. So you can track people in an environment without capturing any picture of the person, and with no face recognition or anything like that. Nice, this is cool. Okay, thanks a lot. Perfect. Thanks. Thank you, Vincent. Thanks a lot. So let's move on, and we'll continue on the subject of artificial intelligence, which is definitely one of the leading technologies we are investing in. A lot of it is about the hardware, but there are many things around the software too. So let's check what we have over here. I see we are quite busy, but that's good. So maybe we'll start on this side. Okay, maybe we can start talking. Always customers here. Yeah, so let me introduce you. Here we are showing NanoEdge AI Studio. And maybe you are already aware that we won this year's Embedded World award in the software category with NanoEdge AI Studio. So, Basel, could you tell us and show us how NanoEdge AI Studio can help our customers and engineers develop and bring artificial intelligence into their projects? Yep. So here we have a project using a time-of-flight sensor, and we are doing gesture recognition with it. It can detect if you're going left, going up, and, switching the blocks, going right.
So you can classify many different things thanks to NanoEdge AI Studio. We're trying many different things, because we've already done the vibration use cases and the current-sensing use cases, and now the time-of-flight sensor gives us many different applications and things to try. Can you show us the demo with the blocks? So, going left; then you can go right. So there are a lot of applications for time-of-flight? There are, but the story here is that we are using time-of-flight as the source of data for the analysis. You could have a different source of data: a MEMS or vibration sensor as well, for all the predictive maintenance. But the real beauty of NanoEdge AI Studio is that you don't need to invest hours and hours, and you don't need data scientists to collect thousands of samples, clean them, classify them, and build your neural network model. What NanoEdge AI Studio really does is, from a basic set of data that you collect directly from the application, it selects from the thousands of different pre-trained models in its own database the one best fitting your application, then proposes to flash it inside the microcontroller and do the real training on the micro, on the edge again. So the whole process, and the time to implement such a solution in an application, gets much shorter, with low investment. Okay, so let's move on. As you saw, Basel explained the demo with time-of-flight. Here we have another solution with NanoEdge AI Studio, showing predictive maintenance with sensors. And I would love to introduce you to Pierre, who is one of the fathers of the solution. So Pierre, if you can say a few words about NanoEdge AI. Absolutely. Should we have a look at the demo? Let's go. So this is NanoEdge AI Studio, a software solution by ST that is targeted at embedded developers.
It's for developers who would like to develop a solution including smart features like AI and machine learning, but without specific knowledge in machine learning. The idea is that you use signal examples coming right from your sensor, raw signal examples, and then you use them to find the best possible AI library, the one that is the best specific solution for your application. And what is exciting about this solution is that it enables embedded learning. So you can learn patterns and recognize them directly from within the microcontroller, which is something very unique on the market. Here we're doing anomaly detection using current monitoring; here we're doing anomaly detection using vibration sensing. So it can be used with any sensor, and it's compatible with all STM32 boards. It's also compatible with the ISPU, which is a MEMS sensor with a processing unit inside. In those types of applications we're at the edge, running embedded AI in the microcontroller, but here we're pushing it even further by embedding machine-learning capabilities directly in the sensor itself. And you know that every time you get closer to the data, it cuts the costs. So it's a huge breakthrough in the industry. Okay, thank you. Thank you. We can continue with our customers. Absolutely, yes. Have a good show. Thank you. Okay, so let's move on, guys. Maybe the last thing we are showing on AI is similar to what you saw with our MPU solution and the X-LINUX-AI package, but here we are running people detection and people counting not on a microprocessor but on a microcontroller. And I'll let Guillaume say a few words about the demo and its best advantages. So here we have two example applications that we can run through our tool called STM32Cube.AI, which allows you to convert pre-trained neural networks into optimized C code for STM32 microcontrollers.
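Pierre's "embedded learning" idea, learning what normal looks like from a few raw signal examples directly on the device and then flagging anomalies, can be illustrated in miniature. This is emphatically not NanoEdge AI's actual algorithm; it is a toy incremental z-score detector on one RMS feature, shown only to make the workflow concrete.

```python
# Toy embedded-learning sketch: learn 'normal' vibration incrementally
# (Welford's online mean/variance, which fits in tiny RAM), then flag
# windows whose RMS deviates too far from what was learned.

import math

def feature(signal):
    """One scalar feature per signal window: RMS amplitude."""
    return math.sqrt(sum(s * s for s in signal) / len(signal))

class AnomalyDetector:
    def __init__(self, threshold=3.0):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0   # Welford accumulators
        self.threshold = threshold

    def learn(self, signal):
        """Update the 'normal' model incrementally, on the device."""
        x = feature(signal)
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def is_anomaly(self, signal):
        std = math.sqrt(self.m2 / max(self.n - 1, 1)) or 1e-9
        return abs(feature(signal) - self.mean) / std > self.threshold

det = AnomalyDetector()
for j in range(20):   # 'normal' machine vibration, slightly varying
    det.learn([(1 + 0.01 * j) * math.sin(i / 3.0) for i in range(64)])
print(det.is_anomaly([math.sin(i / 3.0) for i in range(64)]))  # False
print(det.is_anomaly([5.0] * 64))                              # True
```

The real product searches a large library of models and feature pipelines instead of fixing one by hand; what this sketch shares with it is only the shape of the workflow: learn from raw windows on the edge, then classify on the edge.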
The beauty here is that you would typically see this kind of application on an MPU. It's like a doorbell: it detects the faces, and each face can be compared against a database of enrolled people. So right now I am not recognized, but if I go in front of the camera I can enroll my face, and now I'm recognized as user two. Anybody else who goes in front of the camera will not be known by the system. Cube.AI is a software component that we deliver for the microcontroller. You can use pre-trained models from TensorFlow, Keras, or PyTorch through ONNX, then generate the code and integrate it into your application. So it's a very exciting field to specialize in, right? Many people are excited about it. Yes, and this is the beauty: typically you would require an MPU, and now we start to see these applications running on microcontrollers with very low memory and a small footprint. How's the performance? We're starting to see some decent numbers. Right now we're at 4 FPS; a few years ago we were running at only 1 FPS. So every year we're making progress. Is somebody making some magic in the background to make it faster? We have a very dedicated team optimizing our Cube.AI library to take advantage of the DSP instructions and the dual-issue pipeline of the Cortex-M7 microcontrollers. And there's even more that could be optimized? For sure, always. Yes. Okay, thank you, Guillaume. So let's move on, and with this we will leave the part of the booth dedicated to artificial intelligence. Of course, the next important topic is connectivity, because the whole world is becoming connected, and two of the new solutions we are showing on our booth this year are linked to connectivity to the cloud: an easy and secure way to bring your data to a cloud if you need it. One of them is AWS, and I will ask my colleague Ramkumar about it. So yes, here it is.
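The enroll-and-recognise flow Guillaume demonstrates can be sketched as a tiny database of embedding vectors with a similarity threshold. In the real demo a neural network produces the embeddings; here they are hand-made vectors, and the names and the 0.8 threshold are purely illustrative.

```python
# Face enrollment sketch: each face becomes an embedding vector,
# enrolled users are stored in a small database, and recognition is a
# nearest-neighbour search gated by a cosine-similarity threshold.

import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

class FaceDB:
    def __init__(self, threshold=0.8):
        self.users, self.threshold = {}, threshold

    def enroll(self, name, embedding):
        self.users[name] = embedding

    def recognise(self, embedding):
        """Return the best-matching enrolled user, or None if unknown."""
        best = max(self.users.items(),
                   key=lambda kv: cosine(kv[1], embedding),
                   default=(None, None))
        name, ref = best
        if ref is not None and cosine(ref, embedding) >= self.threshold:
            return name
        return None

db = FaceDB()
db.enroll("user1", [0.9, 0.1, 0.2])
db.enroll("user2", [0.1, 0.9, 0.3])
print(db.recognise([0.12, 0.88, 0.31]))  # user2
print(db.recognise([0.5, 0.5, -0.9]))    # None
```

This is also why "anybody else will not be known by the system": an unenrolled face simply fails the threshold test rather than matching some default identity.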
And could you say a few words about our solution with AWS and how the U5 helps make this connection reliable and secure? Yeah, here we have AWS IoT running on the STM32U5. The STM32U5 was specifically chosen for the security built into the device: it has best-in-class security with a hardware unique key, which is very important for secure solutions that need the key stored on the device, on the MCU itself. On this board we also have STSAFE. If you want to go for Common Criteria EAL5+ solutions, you have STSAFE on this board, tied to the STM32U5. And the board is now qualified and running on AWS IoT Core. As you see here, this demonstration is running AWS IoT Core, and it's a cold chain. The idea is to carry vaccines at the right low temperature, and if there is a cold-chain break, you can identify it. And if you have a GPS connected through any serial interface, you can identify where the break happened. All of this is done in an application built on Amazon services. Here you see the temperature alarms, which are already indicating a high temperature, and here you see the humidity and temperature on the screen. We can also show a GPS screen on this one; that is the idea. And this is running an Amplify database on top of the STM32U5 running AWS IoT Core. Over there we also have the entire Device Defender application running; you can do that too. So, in a nutshell, the STM32U5 is the best-in-class device from ST to showcase our low-power technology and high security for AWS IoT customers. When you combine perfect cloud features with perfect hardware, it's like the next level of magic happens, right? Yes. So many new ideas and so much innovation happening in this field. That's true.
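The cold-chain alarm logic Ramkumar describes, checking telemetry against a temperature band and attaching a location to any excursion, amounts to a few lines. The 2 to 8 degrees Celsius band below is an assumption (a common vaccine storage range), not a figure from the demo.

```python
# Hedged sketch of cold-chain alarm logic: each telemetry sample is
# checked against a safe band; excursions produce an alarm record
# carrying the GPS fix if one was attached to the sample.

TEMP_MIN_C, TEMP_MAX_C = 2.0, 8.0    # assumed safe band

def check_sample(sample):
    """sample: dict with 'temp_c' and optional 'gps'. Alarm dict or None."""
    t = sample["temp_c"]
    if TEMP_MIN_C <= t <= TEMP_MAX_C:
        return None
    return {
        "alarm": "HIGH_TEMP" if t > TEMP_MAX_C else "LOW_TEMP",
        "temp_c": t,
        "gps": sample.get("gps"),   # where the break happened, if known
    }

telemetry = [
    {"temp_c": 4.5},
    {"temp_c": 9.2, "gps": (48.14, 11.58)},
    {"temp_c": 1.0},
]
alarms = [a for a in map(check_sample, telemetry) if a]
print([a["alarm"] for a in alarms])   # ['HIGH_TEMP', 'LOW_TEMP']
```

In the actual demo this evaluation happens in the cloud application; the device's job is to deliver the signed, authenticated telemetry that makes the alarms trustworthy.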
The need for security is increasing year by year. And as Ramkumar said, the U5 is today best in class, because it's the only microcontroller on the market with PSA Level 3 and SESIP Level 3 certification on the hardware chip. So we really bring another level of security to the hardware, to the device. And it's been a very highly requested feature to have all this security in IoT. In IoT you need a secure boot and a secure firmware update solution, and this is implemented by ST; we have that, aligned with TF-M, so it runs with secure and non-secure zones. It's a very good solution for a cloud-connected, low-power microcontroller from ST running a Cortex-M33 core. The AWS area is just getting more and more busy now, actually. Oh, AWS is very popular in the cloud arena. They have system solutions across industrial, consumer, and many other applications, even wearables; all the consumer applications needing to connect to the cloud have full ecosystem support on AWS IoT Core. Yes, thank you. Thank you, Ramkumar. So, talking about the U5: after the security, let's check the low-power features as well, because, as we said, IoT applications are very often battery operated, and energy efficiency is a big topic. What we try to make sure is that the U5 is also very power efficient. My colleague is right now making a pitch to a customer, so let's not interrupt him; we can come back. Let's move on to the RF field. In this part of our booth we concentrate products and solutions for RF technology. So we speak BLE, Thread, Zigbee, the newly introduced Matter, and sub-gigahertz solutions. Let's first speak with Witte. Witte is our RF expert, leading a group of engineers providing solutions and support to our customers implementing RF SoCs from ST. So I'll let Witte say a few words about what we have that's new and what kind of cool demos we are showing over there. Thank you, Roman.
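Secure boot, mentioned above as a must-have for IoT, reduces to one idea: verify an authentication tag over the firmware image before running it. Real implementations (for example, TF-M-based ones) use asymmetric signatures anchored in a hardware root of trust; the HMAC below is only a self-contained stand-in so the sketch can run anywhere.

```python
# Secure boot in miniature: refuse to run an image whose tag does not
# verify against a key the device trusts. HMAC-SHA256 stands in for a
# real signature scheme; the key would live in hardware on the U5.

import hashlib
import hmac

DEVICE_KEY = b"device-unique-key-demo"   # illustrative, not a real key

def sign_image(image: bytes) -> bytes:
    """Done once at build time: tag the firmware image."""
    return hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()

def secure_boot(image: bytes, tag: bytes) -> str:
    """Done on every boot: reject any tampered image."""
    if hmac.compare_digest(sign_image(image), tag):
        return "booting firmware"
    return "halt: image rejected"

firmware = b"\x00\x01\x02 application code"
tag = sign_image(firmware)
print(secure_boot(firmware, tag))              # booting firmware
print(secure_boot(firmware + b"\xff", tag))    # halt: image rejected
```

Secure firmware update is the same check applied to a downloaded image before it replaces the running one, which is why the two features are always mentioned together.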
So hello, everyone. We have our great STM32WB. It's not new, but it's still news for a lot of people that ST also produces radio microcontrollers. It's a dual-core, dual-radio device. And what we are showing here is that we also extended the capabilities of the stack. So is this a mesh? No, this is not a mesh; this is just demonstrating our multi-link capabilities. But I see there is some issue with the demo. It's a demo; it happens sometimes. Always happens. What we have been demonstrating here for the last two days is that we are capable of maintaining a connection with eight phones at the same moment. So this device is the slave on the BLE side, the phones are the masters, and we can transmit data from the slave to all the masters at the same moment. But we are also demonstrating some extra features we implemented at the stack level. We implemented extended advertising here, so we can now transfer a lot of data just by using advertising, meaning the device doesn't need to get connected to the master; the link doesn't need to be established. And we can really have a device like this, demonstrating a tracker: it's just sensing temperature, sensing the location, and putting the data in the internal memory. And then when we get close to the scanner (I'll take another one, because this one was already scanned), when I take a unit, it downloads all the data. We formed a link just through the advertising, and we also then transferred the data to the tablet. And a quite nice small detail about this demo is that we are using the Web Bluetooth API here, which is very cool. So for every single engineer out there: if you want to implement a very simple user interface for your testing, development, or even for the application, please look for the Web Bluetooth API if you haven't heard about it so far, because it's a JavaScript-based framework.
You can use the Google Chrome web browser to access the Bluetooth adapter, and you can implement an application that operates the Bluetooth link from the web browser. It's very cool. What is BLE multi-link? Is it part of the standard, or are you doing something special? Multi-link means that the device can maintain more than one link, so it can be connected to more nodes at the same moment, as master links or slave links. It can be connected to one or two smartphones at the same moment, and at the same time it can be a master for multiple slaves. So it can really operate multiple communications and create something like a scatternet, which existed before the mesh technologies. So if you want to create a small network of sensors or products which communicate with each other, you do not necessarily need to switch to mesh network technology. You can still use standard Bluetooth Low Energy, with the standard stack, with our device capable of multi-link communication. The Web Bluetooth API is so cool. Is it part of the spec, or something you add? It's not coming from ST. We are promoting a technology, a solution, that is not coming from ST, but it's not yet widely known that it exists. It's on GitHub, so everyone can use it; it's open source. And every Bluetooth device gets a little web page? No, no, no. You will be able to do it just with JavaScript. It means you do not need to open Android Studio and implement an Android application, and you do not need to learn some other programming language. You can just use JavaScript, and a very simple application running in the web browser will operate the Bluetooth link. So that's a very good question, okay. For standard advertising it's a 20-byte payload, or 27 if you consider the overall data frame.
But here we can use even the extended data frame, so it's 251 bytes overall; that's the maximum number, I hope I'm not wrong. I'm not good at remembering numbers. So is that the absolute payload? Yes, it is the absolute payload, because we are combining different channels: we are using extended data frames, and we can combine multiple packets to send the overall data payload. So is Bluetooth Low Energy a fun field to work in? It's very, very fun. Very dynamic, a constantly changing technology. Okay, perfect. Let's move on then. Thank you, thanks for all this. So that was just an example. Of course, we also show here the latest arriving protocol, Matter, for which we have a first solution, and we are preparing to bring it to market in the coming weeks, let's say. You can see the first demo already over here. Now let's go back to the U5. Sorry, guys, but I see the booth is available now, which is nice: an interesting feature of the U5. So yes, let me welcome Manuel. Hello, nice to meet you. Hi. So Manuel, please tell us something about the U5 and its focus on low power. What is so unique inside this microcontroller that makes it so special for low power consumption? Yeah, so there are two main important features. One is security, but here we are focusing on ultra-low power. On the U5 we have a dedicated state machine which works independently from the core. So the core is off, or let's say in Stop 2, and the peripherals can work autonomously, because the CPU is not fetching any instructions. We have a dedicated DMA instance, a low-power DMA, which handles the transfers instead of the CPU. So you have a subset of peripherals that you can use while the core is in Stop mode. This modality is called LPBAM, low-power background autonomous mode, and right now it is a special feature of the STM32U5.
I can show you which peripherals you can use, and some use cases. You can use communication peripherals like I2C, SPI, and UART; in fact, this is what we are demonstrating here, and I'll show you later: an I2C read from an accelerometer. In addition, you have a dedicated 12-bit ADC for this low-power domain, which is called the smart run domain, that you can use to acquire your data. The data are stored in a dedicated SRAM; down to Stop 2 you have 16 Kbytes of SRAM available. And you can decide to wake up the core on whatever strategy you prefer: on transfer complete, or when you reach a certain threshold, for example. Other peripherals available are the DAC, the digital-to-analog converter, and the low-power timer. You can also create control loops, because some analog peripherals are available too: comparators and op-amps. And you have dedicated low-power GPIOs available, which are very interesting, for example, for implementing your own data transmission protocol. In general, what you can do is chain different peripherals to achieve your task while keeping the MCU in low power, achieving ultra-low power consumption. With this solution, you can cut your power consumption by up to a factor of 10. You're talking about 16K? 16K of SRAM down to Stop 2. It means that when you initialize and build your LPBAM application, you have to be sure that the variables and the handles that you use in your application are stored in these 16 Kbytes. When the memory is full, you can wake up your core, do some computation, and then go back to your task, for example. That's a smart strategy. Yes, this is a very smart strategy. Another interesting fact: when you're doing an analog conversion, you can also decide to wake up on a threshold, so when you reach a certain value, you can wake up the core.
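The pattern Manuel describes, peripherals filling the smart-run-domain SRAM while the core stays in Stop 2, with a wake-up on transfer complete or on a threshold, can be modelled in software. The 16 Kbyte buffer matches the figure above; the 2-byte sample size and the wake threshold are illustrative assumptions.

```python
# Software model of the LPBAM idea: samples accumulate in a small
# buffer without 'core' involvement; the core wakes either when a
# sample crosses a threshold (analog-watchdog style) or when the
# buffer is full (transfer-complete style).

SRAM_BYTES = 16 * 1024        # SRAM available down to Stop 2
SAMPLE_BYTES = 2              # one 12-bit ADC sample stored as 2 bytes
CAPACITY = SRAM_BYTES // SAMPLE_BYTES

def run_autonomous(samples, wake_threshold=3000):
    """Return (buffered_samples, wake_reason); reason 'none' if the
    input ends before any wake-up condition fires."""
    buffer = []
    for s in samples:
        buffer.append(s)
        if s >= wake_threshold:
            return buffer, "threshold"
        if len(buffer) == CAPACITY:
            return buffer, "buffer_full"
    return buffer, "none"

buf, reason = run_autonomous([100, 200, 3100, 50])
print(reason, len(buf))      # threshold 3
buf, reason = run_autonomous([500] * 10000)
print(reason, len(buf))      # buffer_full 8192
```

The power saving comes precisely from the fact that nothing in `run_autonomous` needs the CPU: in hardware, the low-power DMA and peripherals do all of it while the core sleeps.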
Somebody's asking: 128K of SRAM would be better, what would you say to that person? Yeah, you can definitely use more RAM if you go to higher power states. We have to say that for the implementations we've seen so far from customers using LPBAM, 16K is pretty much okay, because you can always wake up for a while, do your operations at 160 MHz, and then go back to Stop 2 and carry on with your ultra-low-power task. So maybe I can give you a short overview of the demo we are presenting here. We are reading an accelerometer via I2C; we have a dedicated instance, which is I2C3. The acquisition from the sensor is triggered by a low-power timer, and the same timer also triggers another timer, low-power timer 3, which generates a completely independent task: a modulated PWM. So we basically have two DMA channels doing two different tasks. One is configuring and reading data from the sensor; here we are reading two times six bytes from the sensor. The other DMA channel takes care of generating a variable PWM in circular mode. To do this we use our U585 Discovery board, which has an accelerometer and gyroscope on it, plus some nice connectivity, Wi-Fi and BLE, together with our power shield, which is based on an L4, and our Cube tools. Here we have the I2C signals; if I stop the acquisition, you will see that the sampling period is around one millisecond, and on the third channel we have the modulated PWM. Then we have three other channels, used for debugging purposes, to see what's going on in the smart run domain; in fact, this is a very clever way to debug a low-power application. When these three signals, which are CDSTOP, CSLEEP, and SRDSTOP, are high, it means we are in Stop 2; and when a signal is toggling, it means a DMA transfer is happening. Now we measure the consumption with our X-NUCLEO-LPM01A and our CubeMonitor-Power. I will show you. So, we are now measuring the consumption.
The application is running with CubeMonitor-Power. One very interesting feature is that you can select a time window and get a calculation of the average power and the average current used. In this case, with low-power background autonomous mode and a one-millisecond timer, we are in the range of 61 microamps. We have also built a benchmark application which instead uses what we call legacy mode, so not using LPBAM. You have to imagine that the core has to wake up whenever a transfer is happening, and you keep switching between Stop 2 and Run to do the transfer. We can demonstrate this too: if I flash the example that uses this legacy mode, I'll show you, we can observe two things on the logic analyzer. First, what is happening on those three channels is a bit different from what we saw before: the lines are toggling, meaning I'm waking up very frequently, which is not very good for power consumption. And in fact, if I measure the power consumption of this legacy application, you will see values around 180 microamps, versus 60 microamps. So we are gaining a factor of three in power with low-power background autonomous mode. We can generalize the result by saying that with the STM32U5 and low-power background autonomous mode, you can achieve great benefits in power saving, up to a factor of 10. In this case, we also demonstrated that the gain increases as you increase the sampling frequency of the I2C interface. And this result is valid for I2C, UART, and SPI, so for all the communication interfaces we have on the STM32U5. If you want to discover more, I'll leave you some reference links to our application notes, mainly related to low power, and to one of our latest on-demand webinars, which shows how to build this application from scratch. Thank you very much.
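The window averaging CubeMonitor-Power performs, and the factor-of-three comparison Manuel draws from it, amount to simple arithmetic over current samples. The traces below are invented; only the roughly 61 and 180 microamp levels echo the demo.

```python
# Mean current over a selected time window, done by hand, plus the
# ratio between two measured profiles (LPBAM vs legacy wake-up mode).

def avg_current_ua(samples, t_start, t_end):
    """samples: list of (time_s, current_ua); mean over [t_start, t_end]."""
    window = [i for t, i in samples if t_start <= t <= t_end]
    return sum(window) / len(window)

# Invented 1 kHz traces: LPBAM stays flat; legacy spikes on each wake.
lpbam  = [(t * 0.001, 61.0) for t in range(1000)]
legacy = [(t * 0.001, 180.0 if t % 10 == 0 else 170.0)
          for t in range(1000)]

a = avg_current_ua(lpbam, 0.0, 0.999)
b = avg_current_ua(legacy, 0.0, 0.999)
print(round(a), round(b), round(b / a, 1))   # 61 171 2.8
```

Because battery life scales inversely with average current, that ratio translates directly into roughly three times the runtime on the same battery for this workload.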
Thanks, Manuel. A very great pitch. Thank you. Thanks. Do we have any questions? It sounds like the STM32U5 is a beast. Yes, it's really our latest product, based on 40-nanometer technology, so very progressive, and really the combination of performance, security, and low power enables many of our customers to create, let's say, their dream applications. Okay, so let's leave the part of the booth where we talk about microcontrollers and microprocessors and the solutions around them, and move on to see some of our other products. And in the live chat, please keep the cool questions coming. Okay. All right, let's go. Okay, so we've moved to the second half of our booth. After the microcontrollers and microprocessors, let's check what we have for sensors and other products. And I'm here with Zuzka. So Zuzka, could you tell us something about this great sensor, which you already saw a little when we talked about NanoEdge AI Studio? Here you see the hardware in action. So good morning, everyone. Here we are showcasing our new sensor. It is called the ISM330IS, and it offers an accelerometer and gyroscope. But besides that, it also includes the so-called ISPU, an intelligent sensor processing unit. This is a small but capable RISC core able to process data with much lower power consumption than an MCU. On this booth we are showcasing two demos. One demo shows how we can run sensor fusion directly in the sensor: we calculate the orientation in the sensor, and then the BLE system-on-chip only collects the data and transfers it directly to the smartphone app. Then we have another demo, which is a combination of the MEMS sensor and the ISPU. It is a simulation of a front door, and the ISPU here is specifically there to detect the state of the door. This means that if I open the door, we can see it detected by the red blinking LED.
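The door demo reduces to the kind of logic that can run inside the ISPU: process raw samples next to the sensor and output only a state. In this sketch the "sensor" provides a door-angle estimate in degrees and the ISPU-side code debounces it into OPEN or CLOSED; the 10-degree threshold and 3-sample debounce are assumptions for illustration, not values from the demo.

```python
# Tiny in-sensor state machine: change the reported door state only
# after `debounce` consecutive samples agree, so noise near the
# threshold does not make the LED flicker.

def door_state(angles_deg, threshold=10.0, debounce=3):
    """Yield one state per input sample."""
    state, run = "CLOSED", 0
    for a in angles_deg:
        candidate = "OPEN" if a > threshold else "CLOSED"
        run = run + 1 if candidate != state else 0
        if run >= debounce:
            state, run = candidate, 0
        yield state

trace = [0, 1, 2, 15, 16, 17, 18, 2, 1, 0, 0]
print(list(door_state(trace)))
```

The payoff of running this next to the sensing element is the same one Zuzka names: the host MCU (or the BLE chip) only ever sees "OPEN" or "CLOSED", not a raw data stream, so the rest of the system can sleep.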
So I think that's it for the introduction. Does ST have a long history with DSPs? DSPs, yeah, we do, because processing power is needed for different types of applications. Even in the microcontrollers today, we try to include dedicated accelerators to help with calculations or filters, let's say sine, cosine, correlations. So yes, we do. And new generations of these sensors keep coming. Yes, absolutely. First of all, we listen to our customers, and I like this example of the ISPU, where we integrate a dedicated core to run machine learning directly on the sensor; that is already a big breakthrough. Because in the past, normally you had the sensor, you had the micro, you had the connection between them; the PCB was bigger, more expensive. Now you can do these simple things on one single device. Thank you, Zuzka, see you later. Okay, so let's move on. After the sensors, we also have here a part dedicated to automotive solutions. Let's come back a little to AI, but in the automotive world, which is a bit different from industrial. I'm here together with Alessandro and Max, and they will tell us a few words about AI in automotive, the solution from ST. So guys, the stage is yours. Yes, thank you. Thank you very much, and welcome. We are very pleased to present this new solution for AI at the edge in the automotive market. We are able to put a complete neural network inside a normal microcontroller, so no specific IC. Let me come closer to the microphone. With a very standard microcontroller, in this case our Chorus 4M, though we could also use our Chorus 1M, we are able to embed an LSTM neural network which analyzes the status of the car while driving, recognizing states like bumpy road, normal driving, parking, and so on. And obviously this can serve as a base to build other neural network applications on the edge, without having to add any other computing power inside the car. What do we see here?
What we see here is a simple system analyzing data coming from a sensor; we're using one of our six-axis accelerometers. The data comes into the microcontroller and is analyzed in time series of six seconds, and every six seconds a response is produced. So that's it. Road state monitoring is just one of the many automotive segments you're in? Road state monitoring is one of the new trends, because obviously we all want to increase safety for the driver. So on one side we want to increase safety by monitoring the streets, but also by monitoring the driver's state of health. This data collection, which here is simply done with an accelerometer, can be expanded with several types of sensors, combined together to give a better picture of the driver's state of health. So this is for sure a trend in automotive today. Thank you, guys. Thank you. Very good. Thank you. So yeah, let's move on to the last part of our booth, where we'll see some solutions around NFC, near-field communication, around the ST25 products. And I'm very happy to be here; I spent some time here with my colleague, who can explain why he is here, with these nice boxes and the car and the solution. So please, sir. Sure, yeah. What you can see here is a typical use case for one of our NFC applications: an NFC reader plus tags, inside a small car. We have an NFC reader with its antenna in the front, and in each of those blocks we have so-called NFC tags. And what you can see here is that if we start the toy, it starts driving and automatically detects which kind of tag is inside; the tag basically tells the reader what to do, and the car drives accordingly. The purpose of this toy is to teach small children the first logical steps of programming. So NFC gives a natural interaction. Exactly, exactly.
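The programming toy above is, at heart, a tiny interpreter: each block's NFC tag stores a command, and the car executes the tags it reads in order. The command names and the grid model below are invented for illustration; a real tag would carry something like an NDEF record rather than a bare string.

```python
# Tag-programmed toy car as an interpreter sketch: apply each command
# read by the NFC reader to the car's position; unknown tags are
# ignored, which is friendlier for a children's toy than halting.

MOVES = {"FWD": (0, 1), "BACK": (0, -1), "LEFT": (-1, 0), "RIGHT": (1, 0)}

def drive(tag_sequence, start=(0, 0)):
    """Return the car's final grid position after executing the tags."""
    x, y = start
    for tag in tag_sequence:
        dx, dy = MOVES.get(tag, (0, 0))
        x, y = x + dx, y + dy
    return (x, y)

print(drive(["FWD", "FWD", "RIGHT", "BACK"]))   # (1, 1)
```

Laying out the blocks is literally writing the program, which is the "first logical steps of programming" idea the demo is built around.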
So with NFC you can communicate over several centimeters. There are a lot of use cases, not only with dedicated NFC readers but also communication with smartphones. So there's a lot of added value and added user experience. And it does charging? Charging is also a use case of NFC, which is actually quite new. We have here a proof of concept where you see charging. The big benefit is the tiny antenna sizes: we are talking about sizes on the receiver side of 9.9 millimeters, as an example; it can also be smaller, with a power level of up to 1 watt. It targets applications like smart glasses, smart rings, hearing aids, and similar devices. So it's different from Qi? Yes, it targets smaller applications than Qi, as said, with lower power requirements and smaller sizes. And it's available now on st.com? Yes, we have a webpage for that on st.com about NFC wireless charging. You can check out our webinar there and learn how you can implement NFC charging solutions. So here in your corner, it's possible to enable thousands of new games and ideas? Yes, that's the plan. We really want to enable customers to bring new use cases into gaming, for toys, and also for other applications and markets. So without revealing any secrets, I'm sure we will hear about very cool projects that use this kind of technology, people doing new cool things with it. Yes, lately there have been a lot of different applications: board games, remote-controlled cars, and similar things, exactly. All right, it's very cool. Nice, and is there more NFC at the booth, or what do you want to talk about? Exactly, so let's check this one, because this is very interesting too. So, you see, we are very busy all the time. Okay, so we can come back. So much stuff around here. Yeah, we can come back. You want to show everything? No, no, no. Okay, I really selected a few things, but maybe Patrick, I see he's available. Let's see if he wants to say something? No.
Okay, so let's go. Let's use the time before this interesting demo pod becomes available. On this part of the booth we don't show any products, but we show the STM32Cube ecosystem. So, you know, today when developing with a microcontroller or microprocessor, first you want to select the right component for your application, then you want to quickly integrate the low-level drivers and do the configuration of the peripherals. And this is what we do with our tool STM32CubeMX. So, code generation, exactly. So in the development flow of software, first you do the configuration of the basic peripherals. CubeMX will configure a project and generate the basic project for you, on which you can start to build your own application. Then, of course, comes how and where to build the application and do the debugging. For that, we have several IDEs: STM32CubeIDE, plus the IAR and Keil IDEs, so you as a customer can select between them. The advantage of STM32CubeIDE is that it is free of charge and supports all the STM32s in one shot. It also has some new features for FreeRTOS thread awareness, and we are now also adding Azure RTOS, so ThreadX thread awareness, which is very important when you are working with a real-time operating system, to understand which task is running and which one is suspended. So it helps a lot in the debugging of your system. So we go from code generation to code development and debugging. While developing an application, at some point maybe you want to monitor what's happening in your software. For that, we have STM32CubeMonitor, which is also part of the Cube ecosystem. That one is based on Node-RED technology, and with it you can really create flowcharts, graphs and buttons to observe your software in a non-intrusive way: read variables, read the content of registers, and show them in a graphical and nice way.
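The monitoring idea just described (sampling live firmware variables without stopping the target) can be sketched loosely as follows. This is not CubeMonitor's actual API, which is a Node-RED flow over a debug probe; the `read_word` callable, the watchlist, and the addresses below are stand-ins for illustration.

```python
import time

# Loose sketch of non-intrusive variable monitoring: a watchlist maps
# variable names to addresses, and each sample is one timestamped snapshot,
# like one point on a monitoring dashboard graph. Names, addresses, and the
# dict-backed "target memory" are illustrative assumptions.
def make_monitor(read_word, watchlist):
    """read_word: callable(address) -> value; watchlist: name -> address."""
    def sample():
        return {
            "t": time.monotonic(),  # timestamp the snapshot
            "vars": {name: read_word(addr) for name, addr in watchlist.items()},
        }
    return sample

# Fake target: two "firmware variables" at made-up RAM addresses.
target_mem = {0x2000_0000: 42, 0x2000_0004: 7}
sample = make_monitor(target_mem.__getitem__,
                      {"speed": 0x2000_0000, "state": 0x2000_0004})
snap = sample()
```

In a real setup the `read_word` stand-in would be a debug-port read, which is what makes the observation non-intrusive: the firmware keeps running while memory is sampled.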
And the great benefit of all this is that it is fully free of charge and always up to date with our latest products. So, that's about the ecosystem. Maybe one last little thing. Just before Embedded World, we released the STM32 MCU Developer Zone. So, what is it, maybe you are wondering. Basically, with this STM32 MCU Developer Zone, what we try to do is to bring into one single entry point all the information about the products, software, middleware, tools and partners that you may need to know when starting to work with STM32. So it's like an entry point or intersection from which you can continue your information and data research to be able to develop with STM32. It's a very big task to be a community manager with that approach. Of course, the community itself is a very important part of the support and development flow today. On this side, we do have an extensive community, and we try to moderate and answer, ourselves or within the community, all the questions from our developers and users. Let's see if our colleague is available. I got one question here in the chat, just in case. That's okay. Please ask ST about their plans to include dedicated AI accelerators in the MCUs. Yes, indeed. So, maybe you missed the beginning of our broadcast, but at that point we were really looking at our solutions for artificial intelligence. In terms of hardware accelerators, today we don't have a dedicated one, but be sure that our portfolio will include STM32s with dedicated accelerators for neural-network computation. So, more and more stuff is happening here. Definitely. We don't sleep, we go on, we carry on. So, stay tuned for new products. I'm sure we will be able to introduce more to you in the coming weeks and months. How do people get involved, become developers, find the companies and hire them, get to be an expert? You have everything for all of that, right?
Well, it wouldn't be absolutely true if I said we have everything. There's always something missing and something more we can add. But yeah, to start with STM32, today you'll find a lot of tools, samples and libraries free of charge on our web. You'll find a lot of training materials. We have plenty of educational programs on YouTube. The one I would recommend is the STM32 learning page on st.com. There you'll find a lot of links to YouTube, the so-called MOOC sessions, where we start from the basics: what is a microcontroller, how to select the right one, how to run the first code generation, how to start debugging and monitoring your application. We have dedicated videos and training programs on security, so how to implement a secure boot, and how to implement graphics, HMIs and GUIs. So really, I'm quite confident to say that today we have a lot of things available for you. So if someone wants to start developing, you should be able to find a lot. But at the same time, it's true that we still have a lot of things to do. So yeah, give us your feedback, let us know your needs. We will be very grateful for it and try to bring new things, products, solutions, trainings and materials for you. I need more discovery boards, more little boards to train with, right? Definitely. So yes, you may remember that in the past we were always supporting the community and developers with boards. We are doing it again this year, but not on such a big scale, because of the current market situation. All right. And you said there was something we could film right there, right? Yes. Yes, it's now available. So let's go. So yes, and that would be it on the topic of demo boards. And now Rene is available. So Rene, could you say a little bit more about your demo and the product you are showing here? So actually, we are showing NFC for consumer goods in general, also for access control or payment. But the main demo here is... So it's up and running right here. Yeah. Just...
But the main demo here is a new standard by the WPC. It's called Ki, the Cordless Kitchen. And what we are showing here is the upgrade of a standard induction cooktop with NFC, to enable kitchen appliances to go cordless just by placing them on the induction hob. There is NFC communication for the power demand. If I turn it on, the power is requested via NFC and transferred using standard induction technology. It's also flexible in terms of placement; you can rotate it. It's very safe because there is no cord. What if it gets a little bit wet around it or something? It doesn't matter actually, because there are no cables, no electricity, no contacts. It's pretty safe. And even on a water kettle, for example, of course it takes a while to heat up, but basically... So you just add NFC to standard induction. Exactly. So there's an NFC reader in the induction plate and a receiver in the appliance; those two are communicating for the power demand, and in the end the requested power is delivered via the induction hob. And you can already see that the water is starting to boil in here. The kettle is getting hot, but below, everything is staying cool as usual. Only the kettle itself is warm. It's not dangerous, and it's stable at full power, 2.2 kilowatts of power delivery. Same as if you had a cable. Exactly, but just without the cable. The nice thing is that this is a dual-function hob, which means you can use it for cooking, you can do your scrambled eggs, whatever you want, but you can also put a cordless appliance on top and the power is delivered wirelessly. And the Ki Cordless Kitchen standard is also working to enable hidden transmitters. If you, for example, have a dining table, you can place the transmitter hidden below the table, and if you have a family get-together and you put a cooker on top of it, you don't need to plug it in. You can just put it on the table and it works immediately and totally safely. Nice, so NFC is really changing the world. Exactly.
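The power-demand exchange just described can be sketched as a toy negotiation: the appliance asks for a power level over NFC, and the hob grants it, capped at the 2.2 kW quoted in the demo. This is a deliberately simplified illustration; the message flow and function names are invented, and the actual Ki protocol defined by the WPC is far more involved (identification, safety checks, continuous regulation).

```python
# Simplified sketch of the cordless-kitchen power handshake. Only the 2.2 kW
# ceiling comes from the demo; the request/grant flow is an assumption.
HOB_MAX_W = 2200  # maximum power delivery quoted in the demo

def hob_handle_request(requested_w, appliance_present):
    """Return the power (W) the induction coil should deliver."""
    if not appliance_present:
        return 0  # no appliance detected over NFC: keep the coil off
    # Grant the request, clamped to a sane range and the hob's ceiling.
    return min(max(requested_w, 0), HOB_MAX_W)

# The kettle asks for more than the hob can give; the hob caps the grant.
granted = hob_handle_request(3000, appliance_present=True)
```

The key safety property mirrors the demo: with no appliance present (no NFC link), the coil stays off, which is why a wet or bare surface is harmless.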
And you use it in the right way. We are actually upgrading existing applications with a benefit for the end user, making them more convenient and more safe, especially if you think about smaller kitchens. With a cable you always have a hassle, you can trip over the cable; in this case we just put the appliance on the kitchen table and it's done. So very convenient actually. All right, it's an exciting future for smart homes. People will have this built in, and is it going to be a common standard, right? It's actually a standard, as I said. It's discussed in the WPC, the same group that is doing Qi charging for mobile phones. And there's a dedicated group doing it, the Ki Cordless Kitchen group, where this standard is discussed. Hopefully it will be released very soon, and we will for sure see the first appliances in the near future as well. Will it be a fun gift before this Christmas? I'm not sure. But I would definitely get some of those for my kitchen for sure. Nice, cool. All right, so we are at the end. Really, thanks for watching, thanks for your questions. Are there any last questions? It was a little bit emotional, as you showed on the big screens; a long time has passed and it's great to reconnect. Absolutely, the whole ST team was really looking forward to being here again, to meet our customers face to face, to see the emotions, to have a discussion and chat all together, because I'm sure all of us appreciate the remote way of working, Teams, Zoom and so on, but at some point we got a bit tired of it and we need to come and meet together. So yes, we are definitely happy to be here, and we are looking forward to the next editions to come, to see you and bring you more improvements and new solutions from ST. I just got one question, asking: can it run Doom? I think so, why not? I think there are no limits. Cool, thanks a lot. Okay, thank you guys, ciao.