by befinitiv, he's already here, and he is working in cartography, so he's building maps with the help of cameras, but he also likes to repurpose things he finds useful for other purposes, and that is also what he will be talking about in this talk. It's about Wi-Fi broadcast, so he will show you how to convert standard Wi-Fi dongles into digital broadcast transmitters. Thank you very much. Today I would like to present my work on modifying Wi-Fi dongles to serve purposes they are not intended for by the Wi-Fi standard. One example would be digital broadcast transmitters, but I will also mention some other examples you could use them for. Coming to the contents: I will start with the motivation, because the obvious question is why you would even need to change anything about Wi-Fi devices. We're using them all day and they work quite well. That is true for the intended applications, but there is a class of applications in which the Wi-Fi standard pretty much fails, and that is what we will be addressing in this talk. After the motivation, I will show you the basic principle, and building on top of that, I will introduce some improvements that I made to render such a broadcast transmission really bulletproof. Finally, I will give you some usage examples to show that it's really easy to use, and also show some real video footage that has been transmitted using this broadcasting scheme. So, coming to the motivation. My personal motivation, and a rather good example of an application for this technique, is building a free, open-source first-person-view drone. This can be any type of drone; I depicted a quadcopter here, but it could be a land-based drone, it doesn't matter.
The important part is that this drone has a camera attached to it and live-streams the video down to the operator, and the operator flies the drone only by looking at this live stream. There's no direct line of sight to the drone, so reliability is really important in this application. I imagine all of you have an idea of how you would realize such a system, and at first glance it's pretty straightforward: you would just add some Wi-Fi hardware to the drone, create an access point with it, and then on the ground you have a laptop and you connect to that access point. Really simple. Then from the drone you send down the video data simply as UDP packets. This looks fairly decent, right? It should work. And if you test it at home, you will probably notice that it works really well. Then you go outside, start flying, have the time of your life, and suddenly, oops, you lost the association. This means, of course, that you as the operator are instantly blindfolded, and good luck trying to rescue your drone in that situation. You might think it's not so bad, Wi-Fi usually reconnects automatically, and that's true; it might help you, it might not. The good thing is that you can go shopping for parts for a second drone right away, because your first drone will have crashed by then. So in summary, an association, a stateful connection, is something you really do not want in this application, and that's a problem with standard Wi-Fi, because standard Wi-Fi uses associations. There's another problem I'm coming to right now. I wrote here that we are using UDP packets, and this seems like a smart choice because it's unidirectional: you just send data from the drone to the ground station and avoid all the hiccups of stream-oriented protocols like TCP, where data could queue up.
With TCP you would need to send acknowledgments up to the drone, which is of course not something you would want to do, so UDP seems to be a good choice. But in fact it's not, because it only looks fine on the network layer. On the MAC layer of Wi-Fi, there will still be data flowing from the ground to the drone in the form of acknowledgments. Wi-Fi uses acknowledgments, so the ground station actually has to acknowledge the packets from the drone, and thus you have a required upstream from the ground to the drone. This is obviously something you do not want, because then you rely on two links working perfectly just to get the data from the drone down to the operator. There's another disadvantage of this bidirectionality, which is really the crux of the problem: you ideally want to have an asymmetrical setup. What would you do to increase the range? You would, for example, install a power amplifier on the drone side so that the signal travels further. But with standard Wi-Fi you would need the same amplification on the ground station as well, which is, again, pretty pointless. So bidirectionality is a problem with standard Wi-Fi. There are a couple of other things. Wi-Fi has some automatic control mechanisms, for example transmission rate control. If the two devices are at a certain distance, the received signal strength of course drops at the receiver, and this triggers a mechanism that throttles the transmission data rate on the transmitter side. This happens automatically; it is not under your control. So imagine you're trying to send a 10 megabits per second video signal, and suddenly the card decides to switch to 5 megabits per second. The result will be that data queues up on the drone, latency increases, and you crash. Same problem.
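To put a number on that rate-control problem, here is a quick back-of-the-envelope sketch. The figures are purely illustrative (they are not from the talk's code), but they show how quickly the transmit queue grows once the card silently halves the air rate under a constant-bitrate video source.

```python
# Illustration: why an automatic rate drop is fatal for live video.
# If the camera produces a constant bitrate but the card silently
# lowers the air rate, the transmit queue grows without bound.

def backlog_after(seconds, video_mbps=10.0, link_mbps=5.0):
    """Megabits queued up on the drone after `seconds` of transmission."""
    surplus = video_mbps - link_mbps   # data produced but never sent
    return max(0.0, surplus * seconds)

# After 10 s of 10 Mbit/s video over a 5 Mbit/s link, 50 Mbit (~6 MB)
# sits in the queue; at 5 Mbit/s that is 10 extra seconds of latency,
# and it keeps growing until the link recovers or the drone crashes.
```

Nothing in this sketch is specific to Wi-Fi; it is just the arithmetic behind "data will queue up, latency will increase, you'll crash."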
There's also automatic power control, which makes sense in standard Wi-Fi because you want to limit the interference between nearby networks, so you send with, let's say, the minimum required amount of energy. But since in this application you rely so much on the link working at all times, you do not really want that; you want to always use maximum power. And lastly, you have no graceful signal degradation in standard Wi-Fi. Standard Wi-Fi uses CRC checksums and either delivers a packet with a matching checksum or no packet at all. This is a problem, because let's assume you fly further and further away from the ground station: at a certain point the video feed will just stop, because the CRC checksums turn bad. That is almost a crash guarantee again, because you have no warning up front; it's just suddenly off. What you actually want is to see the transmission errors. That gives you a good hint that it's maybe time to turn around, while you still have enough visual feedback to actually control the drone. So graceful signal degradation: again, not possible in standard Wi-Fi. And that was the motivation for me to flip things upside down a bit. Now we are coming to the basic principle, which is actually quite simple. In standard Wi-Fi you have two device classes: the access point that you connect to, and the station that connects to this access point. In Wi-Fi broadcast, I have renamed the modes in which the devices work: they are simply a transmitter and a receiver. The hardware used for that is just plain standard Wi-Fi dongles, eight euros apiece, so it's pretty cheap. From these mode names you can already infer that it's a truly unidirectional data flow from transmitter to receiver. The transmitter works in injection mode.
Basically, with that mode you can send arbitrarily shaped packets into the air without any association. The receiver runs in monitor mode, meaning it will pick up all the packets floating around in the air, and with an appropriate filter on the receiver side you get just the packets from your transmitter. This already establishes a very primitive but surprisingly well-working link that does not share the problems of standard Wi-Fi that I mentioned on the previous slide. In theory, this is super easy to do. The APIs for these special modes, injection mode and monitor mode, are already there; you can use them in a couple of lines of code and you should be good to go. In reality, the injection side is quite a bit more complicated than anticipated, because we're leaving the domain of standardized Wi-Fi here and are hacking and fiddling somewhere outside of the standardized domain, and you have no guarantees how the hardware will react when used in this non-standardized way. One problem I discovered was a low injection rate. Many chipsets I tested had really terrible injection rates; in some cases I could only get through something in the order of 1% of the air data rate, which was pretty bad. I solved this by simply selecting good chipsets. Then, many of the drivers and firmwares I tested ignored quite crucial parameters that I requested them to obey. For example, TX power: I found many adapters ignoring this setting and just staying at a low minimal power value, which is of course not what we want. A simple and dirty kernel driver patch, just a couple of lines, fixed that for me. Even more importantly, some devices ignored data rate requests. So I requested to send a packet at, say, 54 megabits per second.
And these drivers would always send the packet at 1 megabit per second, which is not enough for video, for example. Luckily, the specific Wi-Fi dongle that I showed in the pictures earlier has open-source firmware, so I could just download that firmware, compile it, and flash it onto the Wi-Fi card. To gain control over the data rate, it was just a one-line change in the firmware, and then I could specify exactly the transmission parameters that I needed for my project. Now, with all these troubles fixed, we are back at the basic scheme, and this already works quite well: if you install it on your drone, there's no problem flying around a couple of hundred meters with such a setup without any special ingredients like amplifiers, big antennas, and so forth. But my initial motivation for this project was to explore the world from a bird's-eye perspective, so a couple of hundred meters didn't really cut it. I wanted to increase the possible range by any means, and one of these means is to just add more of these cheap dongles on the receiver side. You can simply plug them in, and this enables you to do software diversity. At first glance you might think this doesn't look very helpful: we have three receivers that receive the same data stream from the transmitter, so we will just have three copies of the same data at hand. What should we do with that? In reality, this actually helps quite a bit, and the reason is multipath interference. The oversimplified transmission scheme shown here, a single direct path, is something you will never encounter on Earth, maybe in space. Here on Earth you have other objects that cause reflections of the signal, and these reflections interfere at the receiver, either constructively or destructively, and it is pure chance whether you get constructive or destructive interference.
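That constructive-versus-destructive coin flip can be made concrete with a tiny phasor sum. This is a generic two-path model with made-up numbers (wavelength, reflection coefficient), not anything from the Wi-Fi broadcast code:

```python
import cmath
import math

# Two-path model: a direct path with amplitude 1 and one reflected path
# that arrives delta_m metres later.  The two signals sum as phasors at
# the receiver, so the path-length difference decides whether they add
# up or cancel out.

def received_amplitude(delta_m, wavelength_m=0.125, reflection=0.8):
    """|direct + reflected| for a path-length difference of delta_m metres.
    0.125 m is roughly the wavelength of a 2.4 GHz Wi-Fi signal."""
    phase = 2 * math.pi * delta_m / wavelength_m
    return abs(1 + reflection * cmath.exp(1j * phase))

# One full wavelength of difference: constructive, amplitude 1.8.
# Half a wavelength of difference: destructive, amplitude 0.2.
# Moving a receiver just a few centimetres changes which case you get,
# which is exactly why several receivers at different spots help.
```

The amplitudes are arbitrary units; the point is only that a half-wavelength shift, a few centimetres at 2.4 GHz, swings the link from strong to nearly dead.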
By placing several receivers at different locations, you basically twist the shape of this triangle a bit, and that gives you a better chance that at least one of the receivers will not suffer from destructive interference. In reality, this really helps a lot. There are other use cases of software diversity. For example, you can use it to realize antenna diversity. If you look at these black antennas here, these are omnidirectional antennas giving you 360 degrees of coverage, which is already a good starting point, but with antenna diversity you can attach different antennas to different receivers. For example, alongside the 360-degree antenna you could add a very high-gain, long-range directional antenna if you know you will just be flying in that one direction; you can combine the antennas depending on your needs. And if you invest a bit more, you can use lots of directional long-range antennas and realize, let's say, an antenna characteristic that would not really be feasible by purely electronic means. A third use case, of course, is increasing spatial coverage. There are situations, like this top-down-view scenario, where you have occlusions that the transmitted signal cannot pass through. In that case you can simply place several receivers at different locations, and the software will automatically fuse the signals from these receivers so that in the end you get only one logical data stream out of it; all the handover from one stick to another happens automatically. Let me quickly explain how this works; it's quite simple. We have here an example with three cards and four packets, which arrive, of course, consecutively. Let's imagine card zero received packet zero with a CRC error, so something is wrong there.
We don't know exactly how much is wrong; at least one bit seems to be flipped, but it could be harmless or really severe, we don't know. The other cards received good packets, so we just pick one of the two green ones and we're fine; we have received a good packet. Packet one might be good on card zero, have a CRC error on card one, and be completely missing on card two. In that case it's an easy choice: we pick the green one and again have a good packet in the end. Now, packet two might have a CRC error on one card, be completely missing on another, and again have a CRC error on the third. What do we do here? It's a tough choice, but the best thing we can do is pick one of the packets with a CRC error, preferably the one with the higher received signal strength. Besides that, there's not much more we could do. And packet three might be missing on all of the cards. This typically happens when you have some external interference; someone switches on a light, creates a spark, and this destroys the reception on all of your receivers. Again, there's nothing we can do about that. Now, this combined stream of packets is of course not yet satisfactory. It's still better than a single card, but there are still some artifacts. How can we deal with those? Simply by adding forward error correction to the data. The way I implemented it is actually quite straightforward: to these data packets I simply add forward error correction packets, two or more; it's configurable depending on your needs and your link quality. For the first two packets there's nothing to be done, they are good already, so now we deal with the broken packets. We start with the worst case, which is the missing packet: we have no data at all there, so we should start with that one, and we apply the first FEC packet to the missing packet.
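Before moving on to the FEC part, the per-card selection rule just described can be sketched in a few lines. This is my own reconstruction of the logic, not the project's actual code; each card reports either nothing (packet missing) or a tuple of payload, CRC status, and signal strength:

```python
# Software-diversity selection: for one packet index, combine the
# reports from all receiver cards into a single chosen payload.

def select_packet(reports):
    """reports: one entry per card, either None (packet missing)
    or a tuple (data, crc_ok, rssi_dbm).
    Returns the chosen payload, or None if no card saw the packet."""
    received = [r for r in reports if r is not None]
    if not received:
        return None                       # lost on every card
    good = [r for r in received if r[1]]
    if good:
        return good[0][0]                 # any CRC-clean copy is fine
    # only corrupted copies left: prefer the strongest signal
    return max(received, key=lambda r: r[2])[0]

# card 0: CRC error, card 1: good, card 2: missing -> take card 1's copy
assert select_packet([(b"x@", False, -80), (b"xy", True, -70), None]) == b"xy"
# all surviving copies corrupted -> take the strongest one
assert select_packet([(b"a!", False, -85), None, (b"a?", False, -60)]) == b"a?"
```

The fused stream is then handed to the FEC stage described next, which cleans up the remaining corrupted or missing packets.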
You can think of these FEC packets as Joker cards, so to speak: you can recalculate one of the data packets by using an FEC packet. So this one is used up, and from it we can recover the original content of the missing packet. We then, of course, do the same thing with the other broken packet, and we have a good data stream again. Now, this is the optimal example; in reality you sometimes have situations where you have used up all FEC packets before you have repaired all data packets. That can happen, and if it happens too often, you should just dial up the number of FEC packets and you should be good again. All right, so that was the basic principle, and now I'd like to show you some usage examples of the tools I developed that realize this transfer mode. The first, slightly artificial example is a simple file transfer. We first start the receiver, because we want to capture all packets from the transmitter and not start in the middle. This simply works by starting the rx program, which is part of the Wi-Fi broadcast software, and specifying the Wi-Fi adapter it should use. In the case of software diversity, you would just list several adapters and it would automatically do the software diversity under the hood. All the data it receives, it outputs on stdout, and you can pipe that into ImageMagick's display program. On the transmitter side it's the same, just inverted: you output something to stdout, in this case a GIF file of your new drone, and pipe that into the tx program of Wi-Fi broadcast. As soon as you execute this command, the data is sent out into the air, picked up by the receiver, and subsequently displayed by ImageMagick. Pretty simple, but, like I said, an artificial setup. This is now an actual example that I'm using on my drone to transmit the video. As the PC, I'm using a Raspberry Pi sitting on the drone, and I start this command.
This looks a bit more complicated, but it's actually fairly simple. The first part, from here to the pipe, is simply a standard Raspberry Pi tool that outputs H.264-compressed video data on stdout. Again, you pipe that into the tx program and it will immediately be sent out into the air; there's no need to have any receiver enabled, it will just be emitted into the air. There's an alternative: if you do not want to use a Raspberry Pi, you can use a GStreamer pipeline, which does pretty much the same thing. You capture from a Video4Linux device, compress to H.264, output to stdout, and pipe that into the tx program. On the receiving end you do the same in reverse: you use the rx program, pipe it into a GStreamer pipeline, and display the image on your screen. And this is already a setup that can fly pretty well for you. Now, Wi-Fi broadcast is actually agnostic to the data it transports: you pipe data in and it falls out on the other side of the channel, so to speak. On its own that's not very useful for flying, so I developed some components that complete the drone application. For example, I created a Raspberry Pi image that you can simply burn onto two SD cards; you put these into your RX and TX Raspberry Pis, switch them on, and you have a video link running, which is quite nice. On the RX side there is also support for recording: if you just add a USB stick to the Raspberry Pi, it will automatically record the video of the transmission. I also developed an OSD, which stands for on-screen display: an overlay on the video that shows some telemetry information of the drone, like battery status and so forth. And I also ported everything to Android, which was a bit more complicated than anticipated; let me give you some impressions.
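As an aside, the "Joker card" behaviour of the FEC packets from earlier can be shown with the simplest possible scheme: a single XOR parity packet per block. This is a toy illustration of the idea, not the actual Wi-Fi broadcast implementation, which uses a proper FEC code with a configurable number of parity packets (and so can repair more than one loss per block):

```python
from functools import reduce

# Toy FEC: one parity packet that is the XOR of all data packets in a
# block.  It can reconstruct exactly one missing packet per block (a
# packet with a bad CRC is simply treated as missing here).

def make_parity(packets):
    """XOR of all equal-length data packets; transmitted alongside them."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), packets)

def repair(packets, parity):
    """packets: received block with at most one None (the lost packet)."""
    if packets.count(None) == 0:
        return packets                            # nothing to repair
    present = [p for p in packets if p is not None]
    # XORing the survivors with the parity cancels everything except
    # the missing packet, which pops out as the result.
    missing = make_parity(present + [parity])
    return [missing if p is None else p for p in packets]

block = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]
parity = make_parity(block)
damaged = [b"pkt0", b"pkt1", None, b"pkt3"]       # packet 2 never arrived
assert repair(damaged, parity) == block
```

With two or more real FEC packets, as in the talk, two losses per block become repairable; the "used up" bookkeeping stays the same.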
This is what you could mount onto your drone: a Raspberry Pi Zero, a pretty small device that weighs only a couple of grams, and you can get the matching camera from the Raspberry Pi Foundation. That's a pretty good setup that works even on tiny drones. On the RX side, this is my clumsy self-made setup. On the left you see my video goggles, cut out of a piece of foam; the blue thing in the middle is the battery; the gray box on the right is a Raspberry Pi; and the white thing is, of course, the Wi-Fi receiver. This was quite a nice setup: I have the components in the pockets of my jacket and the goggles on my head, and that is enough to fly around. This is an example of the OSD. Here you see basically everything that might help you get your drone safely back home: received signal strength, battery status, artificial horizon, distance to home, and so forth. We will see other examples of this in a second. This is a self-captured screenshot of my Android port: the Wi-Fi broadcast camera looks at the tablet, and the tablet took the screenshot, so you get this nice recursive tunnel. Again, there is a Wi-Fi dongle connected to the Android device. This was quite a bit nastier than expected, because I needed to recompile the Android kernel, since it of course didn't support the Wi-Fi dongle. Then I needed to run Wi-Fi broadcast in a chroot environment and pipe that into netcat, which sends UDP packets to a local port. On the Android side, I had an app that receives data from the local port, decodes the H.264, and displays it on the screen. So it's not exactly user-friendly, but it works. As a conclusion of my talk, I would like to show you a recording of a long-range video transmission. It wasn't done by me but by another, let's say, crazy guy. What you see here is again the OSD, and the video in the background is actually the footage as recorded on the ground.
So this is the actual live footage that the drone pilot sees and uses to control his drone. This is a fixed-wing drone; you see it accelerating here to take off, and now we're in the air. In a couple of seconds we should see the antenna of the drone, which is nothing special: it's just this gray stick here, a standard omnidirectional antenna. Now, if you take a look here at the bottom, this is the distance to the operator, and as you can see, it's crystal-clear video; in this case I think it's 720p. That is really quite a nice visual input for controlling your drone. All right, I think I will leave this running during the Q&A session. Thank you very much, and I'm happy to take your questions. Quite impressive indeed. Thank you very much, befinitiv. Questions, please, to the microphones; we have one, two, and three. And we have a question from the internet, too, but we start with microphone two. Hi, thank you for your talk. You said that there would be graceful signal degradation, but can you also display the connection quality, for example by how many FEC packets had to be used? I'm not displaying that one, but there's something similar: if you look up here, the first number is the number of data blocks that could not be recovered fully by the FEC packets. I'm just wondering why it isn't increasing here, but this already gives you quite a good indication. I think your suggestion is also a good idea, because it would trigger earlier than this one, so that would certainly be helpful. Thanks. Maybe the question from the internet, please. There's a practical question: what is the name of the Wi-Fi dongle you used? Let me switch back to the picture here. This white thing here is called TL-WN722N.
If you buy this one, make sure to get the first revision, because the later revisions redesigned the whole interior of the dongle and use a different chip. Microphone one, please. I'm totally blown away by the kind of range that you get there, because the implication is that with essentially consumer hardware, if you go into the air, you could eavesdrop on wireless LANs over 100 kilometers away. Is that what you're saying? Well, the setup is asymmetrical. The antenna that you saw at the beginning of the video wouldn't have a 100-kilometer range; it's an omnidirectional antenna, and it is received on the ground with a high-gain directional antenna. So it would only work if you installed that high-gain directional antenna on the aircraft. But in that case you're totally right: you could probably observe wireless LANs from 100 kilometers. Right. Number three, please. I'd like to know which technology you use to filter the packets on the receiver side. You might sniff everything once you are in monitor mode, so what are you using to select only the packets from the video? That's BPF, the Berkeley Packet Filter. The packets I'm using have a specially crafted MAC address, and I'm just applying a BPF filter to that, which does the trick. There's more to it: the telemetry, for example, is of course also sent over Wi-Fi broadcast, and Wi-Fi broadcast has a concept of ports to carry several streams in parallel. There, again, BPF filters are used to separate these streams. All right, number two, please. OK, hello. One of the biggest issues in FPV flying, especially for extreme or fast flying, is the delay you get on the video. So how does your digital video link, with its really clear and sharp image, compare to the analog FPV we have had for many years? What's the difference? Because I assume the latency is bigger. It is indeed. I have an example here: this is what you get from analog FPV, quality-wise.
Latency-wise, analog is, I think, roughly 40 milliseconds. Wi-Fi broadcast with the Raspberry Pi is in the 100-millisecond range, so it's quite a bit slower than analog. But actually Wi-Fi broadcast is not the cause of that; it's the video compression. The Raspberry Pi uses, let's say, frame-based triggering: it receives an image from the camera, and only once the image is completely there are the subsequent steps triggered. Ideally, you would pipeline the processing instead, starting to process the first pixels as they arrive. Unfortunately, this is not possible on the Raspberry Pi, because the video compression is closed-source, which is a bummer. To this date I have not found a better alternative that gives lower latency. I looked around quite a bit, and the only thing that would give better latency would be custom hardware where you have full control, like an FPGA. But I decided against that, because Wi-Fi broadcast should be approachable: you should be able to just buy cheap components from an online shop of your choice, assemble everything, and be good to go. FPGAs are a bit too special for that domain. OK, thank you. And another question: you've shown the Raspberry Pi Zero W on the slide and said it's only the Raspberry Pi and the camera, but I assume you're not using the built-in Broadcom Wi-Fi chip; you're using the TP-Link, right? Correct. Yeah, thank you. OK, next question, number two. OK, yeah, thanks for the great talk. I also recognized the USB Wi-Fi dongle; I have one myself. I was wondering: is the first hardware revision with the Atheros chip still being manufactured, or are there just remaining stocks to be purchased? Actually, I haven't bought them for a while, so I don't know if you can still get them; if you go to a shop now, you might not get this device anymore. But there are some other alternatives.
For example, in the 5-gigahertz range you can also use other chips. There is a link on my blog, which is linked from this talk, if you're interested in that. And it's worth mentioning that if you intend to use this for drone applications, there are, I think, three or four forks of Wi-Fi broadcast already that have extended the scope of functionality by, I don't know, orders of magnitude. So you should definitely also check out these forks; they are pretty good. Thank you. Next one. Yes, thank you for your talk. It was impressive to get a clear video signal over 10 kilometers. But one question: how do you control the drone? The other direction of your communication would interest me. You can use Wi-Fi broadcast for that as well: you can run both an RX and a TX instance of Wi-Fi broadcast on the same dongle in parallel; that works. But I personally am a big fan of having the highest possible reliability on the control channel, so for that I use frequency-hopping transmitters like you can buy from RC vendors. These are really almost indestructible. Well, they interfere with the video transmission sometimes, but there are ways around that. So: highest reliability for control, and a bit below that for the video feedback. That's just my personal gut feeling. OK, thank you. Thanks. And now our last question. Hi, regarding the error correction: you've shown that you can use one of the error correction packets to restore a broken or even a missing packet. Now, say you have two broken ones: did you check whether it would be feasible to use both of them and apply statistics to try to reconstruct the data? So you mean I have two broken packets, meaning bad CRC? Yeah. And two FEC packets? Yeah, and basically whether you can use the ones with the broken CRC without consuming the forward error correction ones. OK, no, I didn't do that, so sorry, I have no information on that. OK, thank you. Thanks. OK, one more last question. OK, so here's the last one.
With traditional analog video, you have the possibility of choosing a channel to transmit on, so multiple people can fly at the same time. You mentioned before that in Wi-Fi broadcast there is this notion of ports to separate different streams. How many ports can you currently support, that is, how many drones in the air can stream independently? And is there the possibility of having more or fewer ports if you limit bandwidth? Indeed, if you limit bandwidth, you can transmit more streams in parallel. But I wouldn't recommend using ports to fly drones in parallel, because one of the drones might not play nicely and use up more bandwidth, and then you have a problem. I would rather recommend using Wi-Fi channels to separate them. And actually, the white Wi-Fi dongle I've shown is quite capable: you can even detune it to transmit at 2.3 gigahertz. You shouldn't do that, but you can. OK, thank you. OK, thank you, befinitiv, and a warm applause for you again. Thanks.