Good afternoon everyone, I'm Alexis Duque, Head of Research at Rtone, a design house working in the field of the Internet of Things. We are based in Lyon, France, and we have expertise in electronic design, RF, wireless devices, cloud applications, and also machine learning. We are interested in software-defined radio, we also do collaborative research, and the project I will show you this afternoon is part of that. So, I will try to explain what our motivation is and why we want to use software-defined radio. Then I will introduce the Fed4FIRE+ H2020 project and the testbed for wireless devices, RF, and cognitive radio at imec. I will show how we can use this testbed, how you can access it and deploy your own radio software into it. And then I will show the first results of our experiments.

So, we see that software-defined radio is becoming popular: the cost of the devices is getting lower and lower, and software libraries like GNU Radio are maturing. There is a lot of interest in SDR in academia, and more and more people are working with it. We would like to use it on our own devices, to take what academia is doing and bring it into industrial and commercial projects. In particular, we would like to use SDR for fingerprinting, passive authentication (for pairing, for instance), and also localization. The main application domain we are targeting is manufacturing: wireless workbenches, wireless robots, autonomous vehicles in a warehouse, for instance, and also drones. We would like to authenticate these devices while they are communicating, localize them, and authenticate them according to their location. For instance, if an autonomous vehicle is moving inside a warehouse, we would like to allow it to communicate and to send data only if it is at a precise location.

So, if we look at the state of the art, there are relevant papers already doing quite similar work and pursuing quite similar research. The last one comes from the lab where I did my PhD, and one of our team members, who comes from that lab, also contributed to one of these papers. However, none of these papers study real-world IoT devices. Most of them do not rely on conventional communication protocols. Some of them are not reproducible. Others provide datasets, but they are too small to be studied reliably. Others study fingerprinting, but the nodes do not move, so we cannot learn localization. So, we would like to run our own experiments using off-the-shelf devices, and we need a large testbed to collect enough data and to do it in a reproducible way.

So, we went through the different testbeds that are available, remote testbeds, open testbeds, and we found the Fed4FIRE+ testbeds. Fed4FIRE+ is a research project that offers a federation of different kinds of testbeds across Europe. They also run a continuous open call for SMEs: they provide small grants that allow SMEs to access the testbeds, provide feedback, and run their own experiments. There are different kinds of testbeds in the Fed4FIRE+ federation: wireless testbeds, IoT testbeds, the CityLab testbed, also big data and HPC testbeds, and cognitive radio testbeds.
The ones that particularly interest us are the w-iLab.t testbeds near Brussels, which provide IoT nodes, wireless nodes, cognitive radio nodes, software-defined radios, and a lot of equipment we could use. Here is a picture of the w-iLab.t testbed: it is a kind of big warehouse where sensors and wireless nodes are deployed across the building. There are in fact two testbeds. The first one, w-iLab.t 1, is mostly focused on sensors and IoT devices; the second one is more focused on software-defined radio and also on Wi-Fi and LTE devices. You can also bring your own device into it: you can use it for LTE research, put your own base station into it, and use the hardware that is already in place. This is the one we will use, because there are robots, USRPs, and a lot of devices that I will show you a bit later.

About the IoT devices deployed in the building: there are Zolertia nodes with 2.4 GHz radios, ultra-wideband nodes, ZigBee nodes, and also some other sensors from Nordic Semiconductor, but those are only on w-iLab.t 1 and not on w-iLab.t 2. About the software-defined radio devices, there are different kinds of USRPs and other SDR hardware. All of these devices can be programmed and accessed remotely. In fact, each node is connected to a small embedded PC, usually an Intel NUC, and you can access this PC and then access the node. There are also nodes that are mobile: they are mounted on robots, and you can control the trajectory of the node, so you can collect and send data while the wireless nodes are moving. Here is the map of the different nodes located across the building; you can see the different kinds of nodes and select each of them.

So, how can you access these nodes? First, you request an account on the iLab.t authority web page. It is open: you can just subscribe and fill in the form, they will probably ask you why you want an account and what your project is, then they approve the account and you can access the testbed. The next step is to browse the available devices and book them: you pick the devices you want, select them, and reserve the nodes for the time slot you want. Then they provide the software to access the nodes, which is jFed, so you install it, log in, select the nodes you reserved, and create the experiment: you select the nodes, set the image you want to deploy, and then you get a remote shell through SSH.

For our experiment, we use a mobile node and the Nexus 6P device, the smartphone mounted on the robot, because we can install our own application on it. We created an application that broadcasts Bluetooth Low Energy advertising packets, and we collect the data with the software-defined radios, the USRPs. We built a basic GNU Radio application to collect the IQ data and create our datasets. We also rely on the ble_dump project, an open-source tool that demodulates Bluetooth advertising packets and collects them so you can then open them in Wireshark. Here is the basic flow graph: you can see that we first write the IQ data into a raw file for further processing, to create our datasets.
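To give a concrete idea of that capture step, here is a minimal GNU Radio sketch of the raw-IQ branch only. This is not our exact flow graph: the class name, the output file name, and the tuning parameters (BLE advertising channel 37 at 2.402 GHz, 4 MS/s, 40 dB gain) are illustrative assumptions, and the BLE demodulation branch based on ble_dump is omitted.

```python
#!/usr/bin/env python3
# Minimal sketch, assuming GNU Radio 3.8+ with gr-uhd and a reachable USRP.
# It records raw complex IQ samples to a file for offline dataset creation.
from gnuradio import gr, blocks, uhd

class BleIqRecorder(gr.top_block):
    def __init__(self, freq=2.402e9, samp_rate=4e6, gain=40, out_file="ble_adv.iq"):
        gr.top_block.__init__(self, "BLE IQ recorder")

        # USRP source: complex float samples from channel 0.
        # The device address is left empty here; a real setup may need addr=...
        self.src = uhd.usrp_source(
            ",".join(("", "")),
            uhd.stream_args(cpu_format="fc32", channels=[0]),
        )
        self.src.set_samp_rate(samp_rate)
        self.src.set_center_freq(freq, 0)   # BLE advertising channel 37
        self.src.set_gain(gain, 0)

        # Raw IQ file sink for later fingerprinting experiments
        self.sink = blocks.file_sink(gr.sizeof_gr_complex, out_file)
        self.connect(self.src, self.sink)

if __name__ == "__main__":
    tb = BleIqRecorder()
    tb.start()
    input("Recording IQ to ble_adv.iq, press Enter to stop...")
    tb.stop()
    tb.wait()
```

Running a script like this on the embedded PC attached to the USRP, over the SSH shell mentioned above, would produce the raw IQ file that the rest of the processing works on.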
In the same flow graph, in parallel, we demodulate the data and reconstruct the packets. So we have two outputs: first, the raw IQ data, and second, the advertising packets in a pcap file.

Now, about the experimentation scenario. First, we have the USRP receiving the data and the smartphone advertising; the smartphone can move while the USRP, the receiver, stays fixed. We then use the different emitters sequentially, at the same position with the same receiver, then we move them, and we can use a different receiver. Finally, we can use several smartphones at the same time, to collect advertising from different devices simultaneously. Here is a capture of the testbed while we are running the experiment: we can access a live camera to see what is happening in the testbed. And here are some pictures of the data recording, where we display the pcap in Wireshark; the window above is the remote shell connected to the smartphone.

About the next steps: we will open-source everything and write the documentation, so everybody will be able to reproduce this experiment and improve it. We will also publish our datasets and put everything on Zenodo. Then we will try different kinds of RF protocols, like ZigBee and maybe ultra-wideband, and we will make extensive use of the robots to move the nodes. Now that we have created a dataset, the next step is to rely on another kind of testbed, a virtual lab that is a data center with GPUs, and to do some machine learning on the dataset to fingerprint the devices, identify each device, and also localize them.

I think that is it. We will also participate in stage 2 of the experiments, which is another kind of funding provided by the testbed; if you are also interested in this kind of project, you can apply for that funding, and I think they will be happy to have you run your own experiments and provide feedback to them. Before ending, I would like to thank the team at w-iLab.t who supported us during the experiments, and also our team at Rtone: we are in fact three people working on this project. If you want to follow our updates on this project, we will probably be at the European GNU Radio Days in June in France; there has been a lot of advertising for this event in the devroom today, and we will be happy to meet you there. So, I think it is time for questions, feedback, and comments, and I will be happy to hear what you think about this project.

Do you do any kind of direction finding, and how is it implemented? The SDR you showed only has two RF inputs. Do you do direction finding of the transmitting device, do you know where it is located? Yeah, sure. And how do you do it with only two inputs on the radio? The USRP you showed has only two RF inputs, so how does the localization work? In fact, we use the IQ data for that: because we can move the nodes across the whole building and apply machine learning to that data, the fingerprint is different according to the location in the building. So we do not do triangulation or angle of arrival or anything like that, and we do not intend to use MIMO or an antenna array as the receiver for that.

Can you repeat louder, please? For doing what? You have your stack of servers at the end, presumably to do some IQ data processing, right? Yeah.
Do you provide any software framework to assist with that process? Not yet, not yet, in fact.

There is a question here: do you plan to support sub-GHz sensors? Yes, I think it is on our roadmap, because there are some sub-GHz sensors in the testbed, I think at around 868 MHz, so we can try it.

Do you plan a real-world validation of the results afterwards, because your results may be very specific to the testing environment? Yes, that comes second. The next step is to develop the algorithms and validate everything on the testbed, and then, sure, our goal is to move it to the real world.

Any other questions? None? Then let's thank Alexis again.