All right, if you know what's good for you, now's the time to be quiet, because this man is literally invited just so that I can watch him, and he's awesome. So we will throw you out. All right, ladies and gentlemen, I would like to present Quadling, who normally yells, no, I would like to present Balint Seeber. He is literally invited every year so that all of us running the contest can stare in his direction with our mouths open wondering what the hell just happened. His bio is kind of lame: software engineer by training, Balint is a perpetual hacker, director of vulnerability research at Bastille Networks, and the guy behind Spench.net. What you should really know is that while most of you are discussing burner phones on Twitter, this man came into our village, I think it was three years ago, and he said, "I need to redirect a space probe real quick. Can anybody loan me a cell card?" And he borrowed a cell card from a random stranger and beamed commands to something flying in space to try to redirect it around the planet. So yeah, if you're all scared of DEF CON and he's redirecting spacecraft, seriously, sit down, have a good old time. I would like, with the proudest feeling in my chest, to introduce Balint Seeber and "Hacking Some More of the Wireless World," an extension of his rant of hacking the wireless world last year. This will be cut off when he damn well feels like it, or sometime late after lunch. So sit down, make yourselves at home, meet your neighbors, and thank you, Balint. Thank you for that very kind introduction, Zero. And thank you all for coming to the Wireless Village over some of the other excellent talks that are going on at the moment at DEF CON. We just heard an interesting story from two DEF CONs ago. Is my voice level all right? Can you hear me okay? Up the back? Yeah? Thumbs up? Great. Where's Anders? Anders, where are you sitting?
Anders is the man that let me tether to his phone so that I could get on the internet and talk to the computer at Arecibo, at the Arecibo telescope, to beam commands to ISEE-3. So thank you very much, Anders, once again, for trusting me with your internet connection. So my name's Balint, I work with the vulnerability research team at Bastille, and I'm going to talk to you today about three things. Some of them I started introducing last year, and I'd like to show you some of the additional work that I've done on them since then. One of them, briefly, will be Inmarsat. Another one is implementing something that we saw at a Cyber Spectrum. Cyber Spectrum is a meetup that we run for people interested in sharing projects on software defined radio, in the Bay Area in particular. And then also some new work that I've done with radar, and my aim today is that everybody understands a little bit more about how radar processing actually works. So if you have any questions at any point, please feel free to ask. I will leave you satisfied if I see nodding heads and smiles as opposed to quizzical looks. So let me know how that goes. Again, most of it's done with GNU Radio, and I'll show you some flow graphs and the general concepts behind them. So the first item is going to be Inmarsat, and later on I'll also be encouraging audience participation. I've got some live demos here, so hopefully those will go off, gods willing. So just a very quick recap; if you're interested in more of the details, you can visit last year's talk. Inmarsat is a constellation of birds, and they offer a number of services, one of which is Inmarsat Aero, which is used by aeroplanes that travel outside the terrestrial coverage of normal VHF ground networks, particularly when they're going over the oceans. And Inmarsat is a bent-pipe design, so transmissions will be sent from aircraft to ground stations and back.
It's actually pretty easy to listen to this on the C-band downlink. You can just get a dish, make your own feed, use an LNA and a bandpass filter, and then you're able to decode at least the P-channel, which is what I looked at to begin with, to decode basically ACARS-style messages going over this link. And I have to give credit here to my friend Ian Buckley; this is a setup that he made so that we could receive these signals. So again, I won't go through the detail, I covered it last time, but you can start demodulating it. It's a simple GMSK signal, and then you can work through the process to actually turn what appears to be noise into intelligible data. So the first step would be deinterleaving, and now I want you to remember the idea of an interleaver or deinterleaver. The idea is that you read data into the rows, and then once the memory is full, you read it out from the columns. The purpose of this for the transmitter is to redistribute the information bits over time, so if you have a burst noise source, it won't knock out all the bits in a single packet. Those missed bits will be distributed over time through many packets, and then you can use convolutional decoding, forward error correction, to fix those individual erroneous bits, and therefore you get a much better chance of recovering any corrupted data. Once you do the forward error correction and some descrambling, things look much more structured. The P-channel here is a coordination channel, and some of the messages are actually user messages that contain text information. And if you're a plane spotter and you like to listen to ACARS, this is essentially the equivalent, but coming down from a geostationary satellite. And it's interesting because there's a slightly different flavor to them in terms of navigating and sending directions to planes which are anywhere above the Pacific, for example. So just some examples to give you a feel.
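The row-in, column-out idea can be sketched in a few lines of NumPy. The 3×4 matrix here is arbitrary for illustration; the real P-channel interleaver uses different dimensions.

```python
import numpy as np

def interleave(bits, rows, cols):
    # Transmitter: write bits into the matrix row by row,
    # read them back out column by column.
    return np.asarray(bits).reshape(rows, cols).T.flatten()

def deinterleave(bits, rows, cols):
    # Receiver: the exact inverse operation.
    return np.asarray(bits).reshape(cols, rows).T.flatten()

data = np.arange(12)          # stand-in for 12 information bits
tx = interleave(data, 3, 4)   # on-air order: 0, 4, 8, 1, 5, 9, ...
rx = deinterleave(tx, 3, 4)   # original order restored
```

Note that three consecutive on-air symbols (0, 4, 8) come from different rows, so a burst that wipes all three out corrupts only one bit in each row, exactly the kind of isolated error the convolutional decoder can repair.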
Random human-readable text messages to the crew. You've got weather reports in standard notation there, a different form of weather report commonly used in aviation. You've got AFN, ADS-C, CPDLC, which are different sorts of encodings for messages that pertain to routing, traffic flow, and various other sorts of things. And then this is a good one that I saw. Remember the Galaxy Note 7? They would remind everybody over the Inmarsat Aero P-channel as well. A lot of these messages are also printed out on the cockpit printer or sent to the captain, so informing them about preferred routes for their flight plan. This one is a custom message saying that there's really bad weather somewhere. And this one is interesting because it's actually a step-by-step to, I think, reset one of the displays or subsystems on the aircraft or in the cockpit. This one actually would be a message that you print out. Just before they close the doors, you know, they print out the manifest and the captain signs it. So they would, I'm guessing, print this out there, and then the captain would sign it on the line there. Here, you'll recognize these numbers. Who has a guess as to what these codes are? AKE 91536QF. Any guesses? I don't think they're tail numbers; I think they're something else. No, not the OOA points. And I might be wrong here, but I think that they're the codes for the luggage cargo containers, because if you look on the side, they're all uniquely coded. So I think these are some of the containers that they have in the belly of the plane. Here, there's an alert: the gross takeoff weight exceeds some threshold. You know, again, routing information, reduce the delay. And this is interesting: "Called Gateway and had them bring the fuel up to some value to take into consideration the point-through fuel burn increase." So they're modeling different aspects of the flight. This is actually data regarding all sorts of different subsystems.
So the autobrake and various other telemetry that's sent back to airline operations and manufacturers so they can keep track of the equipment. Because a lot of the stuff there, like the engines too, is all power-by-the-hour, so that's all sent back to Rolls-Royce or other manufacturers. And then various things that are left to the discretion of the captain, as to takeoff or what have you. So that's all well and good. I also spoke previously about decoding all eight channels at the same time, and once you do that, you can actually amass quite a lot of information. You might not want to read all that stuff, but maybe you want to have a look at something else, which is what was just pointed out here: the waypoints that are part of the flight plans. Because if you look at this, you actually see there are north and west coordinates, latitude and longitude, and then there are these five-letter character combinations. Anybody have any ideas what these five-character codes are? That's right, they're navigational waypoints on aviation navigation charts that are established around the world. They often don't actually send the raw latitude/longitude coordinates; they just summarize them with these five-letter codes, because that's well defined. And so what you can do there is parse that and plot that. So at any point when the blue arrow appears, I'm going to show you something outside the presentation. If for some reason I get caught up and forget about the blue arrow, and it's there and I haven't shown you whatever I intended to, yell at me and I'll switch. So you can amass all that data, and then parse it, and get something like this. I parse all that information, and then you can convert it into KML for display in Google Earth. The satellite that I was listening to is actually servicing the Pacific Ocean region, and so you would expect that most of the information flowing through will be regarding planes flying over the Pacific.
And so if you zoom out, you can almost get a feel for the satellite footprint. These are all the flight plans that have been sent to all of the aircraft flying through. It's particularly busy over Japan, say. And if you zoom in a little bit more, then you get all the five-character codes coming up. And you can see, by the... here, I guess this is the registration number for the aircraft, and so you can see what flight plan it was actually taking. I think the red dots are the dots that are actually sent with the raw latitude and longitude coordinates, and you can parse that out. So just to show you what that looks like in the raw: the output of the GNU Radio decoder just dumps all this raw information, and then you have the frequency here, because there are eight channels. And if you parse that a little, you can get those sorts of excerpts that I showed you before. What I did was I just googled these five-letter codes, and through various matches online, you find listings of these for different areas. So for Russian airspace, there was this kind of text listing, so I just threw that in a Python file, and at the bottom I go through and parse that, and it builds up a list of waypoints. Same for Hong Kong and Japan; Taiwan actually had a JavaScript listing, so I did a find-and-replace to make it parsable in Python and then built a sort of self-reflexive object to parse all that. And then the waypoints file just includes all the points, and the parser takes the output of this file, looks for all those coordinates and five-character waypoints, and then just spits out the KML. So it's kind of interesting to see the system actually working and to visualise it that way. So that was the update on Inmarsat. Any questions about that before I move on to the next thing? So the next thing is also aviation-related. It's an implementation of what's been termed an unselective AM receiver.
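The decoder-output-to-KML pipeline can be sketched roughly like this. The waypoint names and coordinates below are invented for illustration, and the real script scrapes per-region listings rather than hard-coding a dictionary:

```python
import re

# Hypothetical waypoint database; the real one was built by parsing
# per-region listings (Russia, Hong Kong, Japan, Taiwan, ...).
WAYPOINTS = {
    "OTR11": (33.40, 141.45),
    "SEALS": (35.00, 150.00),
}

def extract_waypoints(message):
    # Pull out five-character codes and keep the ones we can resolve.
    return [(c, *WAYPOINTS[c])
            for c in re.findall(r"\b[A-Z0-9]{5}\b", message)
            if c in WAYPOINTS]

def to_kml(points):
    # One Placemark per resolved waypoint; KML wants lon,lat order.
    marks = "".join(
        f"<Placemark><name>{name}</name>"
        f"<Point><coordinates>{lon},{lat}</coordinates></Point></Placemark>"
        for name, lat, lon in points)
    return f'<?xml version="1.0"?><kml><Document>{marks}</Document></kml>'

pts = extract_waypoints("ROUTE CLEARANCE: OTR11 SEALS DIRECT")
kml = to_kml(pts)
```

Loading the resulting KML into Google Earth is what produces the flight-plan overlay shown on the slides.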
And I have to give full credit to Kevin Reid, who at Cyber Spectrum number 15 demonstrated this with his ShinySDR platform, which is a really neat web-based SDR receiver that you can use remotely from the actual hardware. What he did, and he's documented this on his blog, which I encourage you to have a look at, is a really nifty sort of reversal of the common AM demodulator concept. (He's switchborg on Twitter.) So with a normal AM demodulator, if you want to listen to aircraft, listen to interactions between pilots and the tower, you know, for takeoff or approach, or taxiing, if you're a bit of a plane spotter, then usually you have a receiver, a software defined radio. And with that thick arrow there, that's receiving a very large chunk of the spectrum. So say you might be receiving, I don't know, 10 megahertz worth, so you can get a nice big chunk of the airband spectrum and see all the little transmissions there. And then you select a channel to listen to, and so that gets downsampled and filtered, and then you go complex-to-mag. Actually, that should be a thin arrow there; I didn't fix that. And then what happens is once you do the AM demod, which is really this step here of complex to magnitude, you output that to your speaker and you hear the audio. What Kevin Reid did was swap these two here. He swapped the downsampling and filtering with the complex-to-mag. And so what that means is you essentially take the magnitude of your entire AM spectrum that you're receiving, like 10 megahertz worth, and then you downsample and filter the entire thing. So you never actually select a channel to downsample to; you demodulate the entire band. But because it's AM, the side effect is you end up hearing the strongest transmitter in the band, which is really neat. Thanks, Russ. There we go. So the idea here is that you end up hearing the strongest transmission. And this works because it's AM, right?
You're demodulating AM because you're simply looking at the power level of your carrier wave. And if you treat the entire spectrum as your carrier wave, then the strongest signal there will cause the largest ripples, and that's what you'll end up hearing. So he swapped them like this, and this is actually really easy to implement. He did something that was really clever, which was to take the receive output, the raw complex baseband, and then apply a slope filter to the spectrum. So on one path you say, I want the left-hand side of the spectrum louder than the right; and on the other, you want the right side of the spectrum louder than the left. And then once you've shaped your spectrum like that, you do the same process: you do complex-to-mag, AM-demodulating the entire spectrum, downsample, and then you play that into the left speaker and the right speaker respectively. So what ends up happening is stuff that's on the left-hand side of the spectrum you end up hearing out of the left speaker, and stuff that's on the right side of the spectrum you hear out of the right speaker. So it spatializes the entire AM spectrum for you. And if you're wearing headphones, it sounds really cool. I've got some speakers here, so in a moment I might ask for absolute quiet, because they're not very loud, and they need to be in stereo. But I'm going to show you a video here, and then I'll show you it interactively. Now, just before I show you the video, I'll explain what's going on here. So this is the flow graph, and this is the whole spectrum. And this is a representation of the frequency response of those shaping filters. So this is the raw baseband that's coming in, and this is that shaping. And there's a blue one and a green one; that's the left and the right channel.
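A quick NumPy sketch of why taking the magnitude of the whole band works. The signals and frequencies below are synthetic, made up for the demo (the real flow graph does this on live SDR samples): two AM carriers share a 1 MHz chunk of spectrum, and the envelope of the composite is dominated by the stronger one's audio.

```python
import numpy as np

fs = 1_000_000                      # pretend 1 MHz of complex baseband
t = np.arange(20000) / fs
audio1 = np.sin(2*np.pi*1000*t)     # 1 kHz tone on the strong station
audio2 = np.sin(2*np.pi*2500*t)     # 2.5 kHz tone on the weak one
band = (5.0*(1 + 0.5*audio1)*np.exp(2j*np.pi*100e3*t)
        + 1.0*(1 + 0.5*audio2)*np.exp(2j*np.pi*-200e3*t))

# Unselective demod: magnitude of the ENTIRE band, no channel selection.
env = np.abs(band)
env -= env.mean()                   # crude DC block

# The envelope spectrum is dominated by the strongest carrier's audio:
spec = np.abs(np.fft.rfft(env))
peak_hz = np.argmax(spec) * fs / len(env)
```

`peak_hz` lands on the strong station's 1 kHz tone, which is exactly the "you hear the strongest transmitter" effect; the left/right slope filters are then just a gain shaping applied before this step.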
Now, the reason why it has these notches in there, and you see those notches reflected on the baseband, is because often you've got birdies or spurs, which are either internal to the receiver, or from man-made noise, or from VORs, which are always transmitting something, or just other things that you don't want to listen to. And sometimes they can be powerful, and they will drown out the legitimate transmissions that you want to listen to. And so I thought it would be cool if you could interactively notch out those frequencies that you didn't want to listen to. And so what this does is, and I'll show you, you can click on the baseband spectrum on the things that you want to notch out. It'll recompute the filter with those notches included, and then apply that to the left and right slopes. And it does it globally, so you can change the frequency of your SDR, and it'll actually recompute everything that's in your passband, and always keep the stuff out that you don't want to listen to. So if I could have it as quiet as you can manage, I'll turn the volume up here so you can hopefully hear it. And so I'm just going to, oh, there we go. So notice I clicked on that and the tone went away? I don't know if you can get the stereo effect; hopefully you can. So see, I clicked there, it applied the notch, and then recomputed that. And I'm changing the baseband frequency, and so it's shifting all the notches along to keep them on the right frequency. And so I moved to where there's ACARS, and it's very loud and powerful; you don't want to listen to it, so I notch that out too. So it's cool, right? Because I'm not actually clicking on the channel that I want to listen to. It just all comes in, and you hear the most powerful thing. And then what's nice is that you can click on remove if you want to remove a notch, and then click on the baseband spectrum, and it'll actually remove that notch, recompute the filter, and then you get, you know, in this case, the unwanted tone back.
So that's that. Now, I mentioned that slope filter, and it wasn't immediately obvious to me how you generate it. It turns out it's actually really, really easy. It's done in a single line of Python in a GRC block that I'll show you. Basically you use NumPy, and use linspace to create a ramp between zero and one over the number of taps that you want. You do an FFT shift, which basically rearranges the two halves back to back, you do an inverse Fourier transform, and then you shift it back. So what you've done is essentially say, I want this frequency response, then you take the inverse FFT, and it gives you the time-domain taps that you can apply in a normal FIR filter. And that's all it is. It's really basic. So I've got a blue arrow, which means I'll show you the actual thing. Now, any questions so far? Who actually listens to the airband and listens to cockpit and tower and all that kind of stuff? A few of you. All right. Was there a question over there? "Just curious, with the interactive notch, you could do equalization to null out tones that are continuous, but is that too computationally intensive across that wide spectrum?" So that's a very good question. Oh, I don't have to repeat the question, because you spoke into the mic. So if you remember on that line, where it was using linspace and had the tap count, you can decrease the number of taps if you're using a higher bandwidth, because naturally, with a larger bandwidth, this thing works really hard. I think I do 6.25 megahertz off the USRP, and it makes it work pretty hard. And I chose a higher tap count so that it would look really good in the frequency response plot. But if you make it much smaller, it looks far more ugly; the response obviously has ripple, but it works. Okay, so you can manage it that way. So let me find the flow graph here.
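The single line just described can be spelled out step by step like this (512 taps to match the talk; the FFT-shift bookkeeping is the only subtle part):

```python
import numpy as np

ntaps = 512
# Desired frequency response: a ramp from 0 at one band edge to 1 at
# the other, laid out in display order (-fs/2 ... +fs/2).
ramp = np.linspace(0.0, 1.0, ntaps)

# Rearrange into FFT bin order (DC first), inverse-transform to get
# time-domain taps, then shift so the impulse response is centred.
taps_right = np.fft.fftshift(np.fft.ifft(np.fft.ifftshift(ramp)))
# The mirrored (other-channel) slope is just the same taps reversed.
taps_left = taps_right[::-1]

# Sanity check: transform back and confirm we realised the ramp.
realised = np.abs(np.fft.fftshift(np.fft.fft(np.fft.ifftshift(taps_right))))
```

These taps then drop straight into an FIR filter block; the interactive version simply zeroes out the ramp at the notch frequencies before taking the inverse FFT.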
I'm going to have a resolution problem, so let me just transfer this, make the thing fit. All right, so I'm looking for "AM unselective". I'm going to just do a live demo. Oh, no, I've pre-recorded them. So that's that. At the moment I've got a single notch in there, and if I take it out, you can hear the tone, right? You don't want that. And if you watch in the console there, I've got this notches.json that contains the list. So if I click this, it'll load the notches, calculate the, what did I do, a complex filter, then rotate it up to the baseband frequency that I selected and apply that here. And what's nice is that you can also change the number of taps. So if I made it 13 instead of 512, then obviously it changes the look of your frequency response. And one other thing, my friend Ian Buckley observed that if you actually add a delay between the left and the right channel, especially if you're on headphones, it emphasizes the stereo effect even more. I don't know whether we'll necessarily be able to tell here, but, yeah, you'll need to play with this yourself; you can add a delay to emphasize that. And you can mute the left, mute the right. And if you look at the actual output of each filter, you can see the baseband plus the filter showing that characteristic, you know, roll-off to one side and the other. And the green lines that are appearing are just basically selecting the FFT bin that has the strongest energy in it. So that's most likely the audio that you're hearing at that frequency, which is being updated at the bottom there: max frequency, right, then left. So again, this is just the implementation. The original idea was Kevin's, but, you know, I thought it might be fun to implement in GNU Radio.
I can take you through the flow graph very quickly. I've got a file source here, and then you get your complex baseband output. And as I mentioned, you take one line up to one filter that contains the right channel taps, and you take the baseband to the other filter that has the left channel taps. And what's nice is that you only need to compute one set of taps, because the other channel is just the same taps but in reverse, which is handy. And then once you've done the filtering, you immediately do the complex-to-mag, which is the AM demod step. Then you resample it to match the sample rate of your sound card. I've got a DC block in there as well, to remove any DC from the AM-demodulated signal. I've got an AGC, so that the audio level that comes out of the speaker is always more or less normalized. A low-pass filter, because AM is quite narrowband here, to focus on the audio passband of that one channel, and then it just goes into one channel of the audio sink. And that's all there is to it. The additional complexity happens more with the interactive side. So the graph there is brought about by this sort of real-time graph wrapper I made. Using an embedded Python block, you just implement the graph like that, and it'll cause it to display. And then every time the variable dependencies, taps left or taps right, change, this line creates a callback for when those variables change, and then it runs this Python. Actually this is multiple lines, but separated by semicolons: the first part uses SciPy to calculate the frequency response of the left taps, then the frequency response of the right taps, and then it sets the data on the real-time graph with the FFT-shifted log10 magnitude of the frequency response. And so that line, whenever the taps change, will update the graph.
And then in terms of actually clicking on stuff, there's a callback mechanism here. With these FFT GUI sinks, you can specify a variable to take the frequency that you clicked on. So when that variable changes over here, I have a little bit of Python: it's a class, and it has this add-notch function in it, and so it takes the clicked frequency. If remove is disabled, it adds the notch; and if remove is enabled, it removes the notch at that frequency. And so there's just a little bit of Python that manages the global database of values. And once that happens, I think it's here: the tap generator class is initialized here, and it basically takes the initial set of taps, which is just that initial left and right slope. And then self.taps is, I think, a reference to the actual taps that the generator will update. So the generator will generate new taps and then call a callback registered within GRC to update the taps, and then it will update in the FFT filter and update on the graph. And that's it, really. So conceptually at least, it's pretty elegant. And again, props to Kevin for figuring that out. Any questions on any of that so far? Okay. All right. So next on the agenda, this is the third and final part: radar, and particularly FMCW radar. I spoke a little bit about this last year, but I'd like to show you some more and explain some more concepts as well. And to do a bit of a recap: who's seen these primary radars at airports? When you fly, you know, you see them rotating there. This is a basic kind of CW radar. And if you watch here, as it rotates, I'm sitting on a hill with a USRP tuned to the frequency, and on the left-hand side it's plotting the magnitude of the signal received. And you can see that it's triggering, I'll play it again, it's triggering on the initial transmission from the radar. And the radar then switches to receive mode to listen for any echoes that are coming back from, in this case, airplanes.
And you can see there that there are also some responses coming back very close in time. So I'm picking up both the initial burst of the radar and what turns out to be ground clutter, mainly nearby. So what is the radar return and how does that work? This is an example return down the bottom, just complex IQ that's been, again, turned into AM, essentially: the magnitude. The way it works is that the radar sends out what's called a bang, which is that initial short pulse. And in this case it's CW, so it's just a continuous wave at a single frequency. Then the radar switches to receive very quickly, and it waits for these echoes to come back, because the radio energy will reflect off, for example, aircraft fuselages. Some of that energy, just a small portion, will come back and be collected by that massive dish that's rotating on the radar. And it's so massive because the signals are so weak, and so you want a lot of gain at the antenna. And then once that period has elapsed, which is the pulse repetition period, a new pulse is sent out, and the process just continues. And this can happen hundreds or thousands of times per second. Now, there are some radar concepts which are important to get a hold of. One is that pulse repetition interval, or frequency: that's the amount of time between pulses, or how many times per second the pulse is sent out, and then the wait period. I'm going to call it PRF, which is the number of times that happens per second. And the width of the actual transmitted pulse, so for how long the radar is actually transmitting, that's also important. So the way you actually figure out how far an object is away from you is by looking at the delay between when the signal went out and when the signal came back. This is called the round-trip time. I'll explain that in a little bit more detail in a moment.
But if you consider these two diagrams here from RF Cafe: these thick black lines here are when the transmitted pulses were sent out, and then you have this return here that comes back. And there's this notion of unambiguous and ambiguous returns in radar. So in diagram A, you have a return that occurs before the next transmitted pulse, and you can then say that this is an unambiguous return, because it happens before the next pulse. In diagram B, the real return actually occurs after the second pulse, so that's in excess of that interval there. And so if you didn't know any better, you would assume that this target is actually really close to your radar, when in actual fact it's even further away. And so you get that sort of false return, and this is ambiguous range. And so what's important to note is that the way you design the parameters of your radar will constrain you and will set the unambiguous range that you can compute. And the easy way to think about it is: the round-trip time, times the speed of light, divided by two, is your virtual range. It's not the real range, it's the virtual range, so just keep that in mind. Now, with that radar that you saw spinning, the primary surveillance radar, I did a recording, and then each of these scan lines is triggered, there's this red part on the left-hand side, when it hears a really loud burst, which is attributed to the radar doing its transmission. And then it keeps listening for a fixed period of time for the return. So this is all me sitting on the hillside with my little USRP with just a whip antenna, not a big radar dish. It captures a fixed number of samples and then waits for the next radar transmission as it rotates. And what amazed me is that you can actually still pick up, with a little whip antenna, all of these structures that are a little bit further out from the ground clutter, which is causing these immediate returns on the left-hand side.
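The two quantities just described, apparent range from round-trip time and the maximum unambiguous range that a chosen PRF buys you, are each a one-line formula. The 200 µs delay and 1 kHz PRF below are example numbers, not the parameters of the radar in the video:

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_rtt(rtt_s):
    # Round-trip time covers the path out AND back, hence divide by two.
    return rtt_s * C / 2

def max_unambiguous_range(prf_hz):
    # An echo must arrive before the next pulse goes out, i.e. within
    # one pulse repetition interval (1/PRF) of round-trip time.
    return C / (2 * prf_hz)

r = range_from_rtt(200e-6)             # a 200 microsecond echo delay
r_unamb = max_unambiguous_range(1000)  # a 1 kHz PRF
```

A target beyond `r_unamb` still echoes, but its echo lands after the next pulse and gets folded back as the short, false "virtual range" the diagrams show.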
And so this is just each scan line, and then I recorded it for a number of rotations. I think this is actually just a single rotation. This is a 50 megasamples-per-second capture into a RAM disk. And what's nice is that taking that linear plot, if you unwrap it into polar space and place it on the radar position itself, this is the Bay Area here, you can see that the ground clutter actually lines up quite neatly with the ground and not with the bay. What blew my mind is that you actually get returns from the power pylons that cross the bay there, and also from the bridge. So these large structures return enough energy to be received by just a whip antenna on the USRP. So that's the simple concept of CW radar returns. And a bit of theory here from Wikipedia; this is to illustrate why CW radar is not the best kind of radar, there are better forms. So again, this is the plot where you have time, and the red CW tone here, the brief one, is what's transmitted by the radar, and these blue patches here are the echoes that come back. And so if you take a matched filter, which in this case is just a filter for the frequency that was sent out, that means you can ignore everything else. And if you apply the filter, you get this plot on the right. So you ignore the transmitted pulse, and then the peak of the response here is at the time where that target was, and this other target is further out, so you get a lower-amplitude response, but again you get the peak there. And this is not the best, right, because you have this ramp up and ramp down, and you'd like to have much more accurate resolution in the time domain of where that target response actually is. And the bigger issue is that if you have two targets that are close together, like this, so those two targets are now not separated by as much distance, and therefore time, then the responses, the echoes, actually blur together.
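You can reproduce that blurring numerically. Here a fixed-frequency pulse is matched-filtered against two closely spaced synthetic echoes; the numbers (a 256-sample pulse, echoes 40 samples apart) are made up for the demo:

```python
import numpy as np

n = 256
tone = np.exp(2j*np.pi*0.1*np.arange(n))   # fixed-frequency CW pulse

rx = np.zeros(2048, dtype=complex)
rx[400:400+n] += tone                      # two echoes only 40 samples apart
rx[440:440+n] += tone

# Matched filtering for CW is just correlation with the transmitted tone.
mf = np.abs(np.correlate(rx, tone, mode="valid"))

# The tone's autocorrelation is a wide triangle, so the two echoes merge
# into one broad lump instead of two distinct peaks.
width = int(np.sum(mf > 0.5*mf.max()))
```

The half-power width of the merged response is comparable to the whole pulse length, which is exactly why a naive peak detector sees only one target here.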
And so now it looks like there's only a single target, if you apply a naive detection strategy. So that's why CW has these drawbacks. A simple alternative, and there are many other radar alternatives, is FMCW, which means frequency modulated continuous wave. So instead of just sending out a fixed carrier wave, you actually send out a carrier wave that increases in frequency, or might decrease in frequency. And this is what it looks like in the time domain: you start at a low frequency and then you quickly ramp up, and it might be linear, it might not be, but you ramp up, and by the end of your transmission you're actually transmitting at a high frequency. And the reason why FMCW is good is because these chirps, this is known as a chirp, when you have something increasing or decreasing in frequency, have the property of strong self-correlation. A chirp has a really good autocorrelation property. And what that means is, if you take that signal and you mix it with a copy of itself, then you'll get a really good response; but if you mix it with anything else, you get a really bad response. Which is not the case for what we just saw with the CW tone, because you get this mixing here and the ambiguous response. So I have a demo for that. We're going to actually listen to what a chirp is, so that you can get a better idea of what it's like, and you can see it. So this is again in the audio domain. I want to emphasize that these are general concepts, and they're transferable between audio and radio and other things. So you'll hear the chirp, and this is actually the response from my microphone on my laptop. So as I'm speaking, you can see it there. Does anybody want to whistle or make a tone? There you go. Just to prove to you that it's live. And now I'm going to let you hear the tone, and you'll see it on the spectrum. So you can't even hear that, but the microphone and the speaker are still generating a response. Okay, so this is a chirp.
It's a very slow chirp, but it's a chirp nonetheless. Radar chirps are much, much quicker, and I'll show you what that sounds like and looks like in a moment. So this is the frequency response in the time domain. A common construction is to use a sawtooth wave. So usually you have a signal source (you might have a sine wave, but here I'm using a sawtooth) and I hook that up to a VCO, a voltage-controlled oscillator. So what I'm effectively doing is putting in a linear ramp, a linear control voltage, and then the VCO will turn that into a frequency modulated continuous wave, and we can see what that looks like here. So I'm going to speed up the chirp here with the... where is it? Do I not have that exposed? Delay, multiplier, low-pass cutoff. Ah, here. So this will take 10 seconds to do the chirp. I'm going to make it do it every second, right? Now I'm going to make it do it 10 times a second, okay? Now if you look at it in the time domain, this is the output of the sawtooth generator, right? So this is not the frequency output yet, the FMCW output; it's just controlling the VCO. Once you feed that through the VCO, it looks like this. So let me slow that down so you can see a little bit better. This is the time-domain plot of the complex output of the VCO block, and you can see that you get a complex sine wave, a sinusoid, but it's increasing in frequency, and it resets. So this is the basis of a chirp radar. We're listening to the chirp here. So again, this should be familiar to you now; we saw it in the complex domain, but here it's in the real domain. Now in this case, we've been listening to a full-duty-cycle chirp generator. So it's constantly generating the chirp, and as soon as one finishes, the next one starts. In the radar scenario we looked at before, we sent out a chirp, we were quiet for a while, and then we sent out another chirp. You can do both. They're both equally valid; they just require different sorts of hardware configurations.
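The sawtooth-into-VCO construction can be sketched offline in a few lines of numpy. This is a hand-rolled stand-in for the GRC Signal Source and VCO blocks, not the actual flowgraph; all the rates and frequencies are invented demo values:

```python
import numpy as np

# Toy parameters (not the values from the talk's flowgraph)
fs = 48000.0          # sample rate, Hz
chirp_len = 4800      # samples per chirp: 0.1 s -> 10 chirps per second
f_max = 8000.0        # peak frequency of the chirp, Hz

# Sawtooth ramp from 0 to 1 over one chirp period (the VCO control voltage)
ramp = np.arange(chirp_len) / chirp_len

# Instantaneous frequency is proportional to the control voltage
inst_freq = ramp * f_max

# A VCO integrates frequency into phase; cumsum is the discrete integral
phase = 2 * np.pi * np.cumsum(inst_freq) / fs
chirp = np.exp(1j * phase)    # complex (IQ) chirp, like the VCO block's output
```

Plotting the real part of `chirp` reproduces what he shows on screen: a sinusoid that starts slow, speeds up, and resets at each sawtooth wrap.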
I'm going to start talking about the full-duty-cycle, continuous version now. Now the other thing to remember here, and this is really important too, is that when we talk about filters, filters need taps, right? Before, with the Aero receiver (you know, the left and right channels, the channel-selective receiver), we generated two series of taps to define two filters, and they impose their shape on the spectrum. Here we're going to consider a filter whose taps are the chirp. So usually there are low-pass filters and high-pass filters, and you can think about how they might shape the spectrum. But you can make a filter out of whatever you want. You can even use filters in the digital domain for the access code or a preamble of a digital packet. Here our filter is actually going to be the chirp itself. So the idea is that whenever it hears a chirp and you put it through the filter, you'll get the maximum response from your filter when the incoming chirp lines up with the chirp that we've defined in the filter's taps. So, and this is what makes it, I think, really click: this is like the diagram we had before. We have the transmitted chirp here, and we have these two echoes, right? Because the echoes are going to be reflections of what we transmitted. And because we're using this special filter that has this great autocorrelation property, we get a really nice, well-defined peak here in time that lines up with the exact response, the echo that came back here. So you get the first target and the second target. And this is the really cool thing: then all this noise is added, right? This is basically swamping out, completely swamping out, the echoes. And amazingly, when you pass that through the filter with the chirp in it, the noise disappears, and you're still left with the responses, the peaks at the two targets. And even if they were overlaid on one another, like we had in the previous plot with just the CW tone, you would still be able to disambiguate those two targets.
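A quick way to convince yourself of that pulse-compression property is to correlate a noisy capture against the known chirp in numpy. This is a toy simulation with invented delays and noise levels, not data from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1024
t = np.arange(N) / N
chirp = np.exp(1j * np.pi * 200 * t**2)    # the known transmitted chirp

# Received signal: two overlapping echoes plus heavy noise
rx = np.zeros(4096, dtype=complex)
rx[500:500 + N] += 1.0 * chirp             # first target (delay 500 samples)
rx[560:560 + N] += 0.6 * chirp             # second target, close behind it
rx += 2.0 * (rng.normal(size=4096) + 1j * rng.normal(size=4096))

# "Filter whose taps are the chirp": numpy's correlate conjugates the
# second argument, which is exactly the matched-filter operation
mf = np.abs(np.correlate(rx, chirp, mode='valid'))
# mf shows two sharp peaks near lags 500 and 560, despite the noise
```

With a plain CW pulse of the same length those two returns would smear into one blob; the chirp's autocorrelation compresses each echo into a peak a few samples wide.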
So FMCW is pretty simple; there are more advanced radar techniques that use coded transmissions, but this is an in-between level where it's still fairly doable, with easy construction, even in GRC. So, any questions so far? Question? A question about the mapping: how did I get the rotational rate of that radar to project it (this is going back a couple of slides)? Yeah, I guessed. I mean, I recorded, I think I can record 70 seconds into 16 gigabytes of RAM, and I think I guessed. The other question was: does the distance from the actual radar receiver (me on the hill) give some kind of distortion? Yes, yeah, exactly. So I can show you later. But if anyone's interested in this notion of the distortion and the actual physical model of the radar system, considering the path propagation, I'll talk about that just briefly. In a Black Hat talk I did a couple of years back, I show a slide where I actually wrote a program to create a rasterization, a visualization, of that distortion map. So you basically give it the offset and angle between the receiver and the transmitter, and then it'll calculate a distortion grid. I didn't do it, but you would then apply that to your returns and it would undistort everything. So that's possible to do. But that's a very good point, in that we're always talking about virtual range and virtual time echoes, so you have to keep in mind how the geometry of your radar is set up. Next question. Could I just go back a slide? Yeah. So there I've shown the filtering; this is from Wikipedia, by the way. Yeah, I realize that; the question is that the previous Wikipedia slide had the two pulses really close together. Yep. Is there a limit for decoding, for filtering the chirp, in terms of frequency or something else, to be able to disambiguate? Yes. So again, that's all set and constrained by the parameters of your radar system.
So: how wide the chirp is, what your signal bandwidth is, what your sample rate at the ADC is, all that stuff. And I've got a little Jupyter notebook that I'll put online; you can enter in all of those parameters and it'll tell you what all the constraints of your radar system are. The question was just that the Wikipedia page showed those signals close together, so it was intriguing to see what the parameters were. Yeah, I mean, that was some simulation that someone put up. But again, it all comes down to the numbers. Just a quickie: wouldn't an autocorrelation give you that rotation rate, aside from just looking at it and guessing? Yes, yeah, it would. I mean, you can; it's just that each capture is like 15 gigs or something, so you could do it, it would just take a little while. But that's exactly right. The point raised is that if you wanted to figure out the rotation rate (I guessed, because I didn't want to do it), what you would do, completely correctly, is run an autocorrelation: take the capture and correlate it with itself, and as soon as similar features match up again, you'll get a response at the time where the radar has completed a single rotation and resumed. Thank you very much. Any other questions? That's great, thank you for these questions, it's really good. No? I'll move on then. So, with this continuous, or full-duty-cycle, radar system, you have your chirp generator and that's transmitted; as you heard here, it's being transmitted out of the speakers. And then the receiver, in that demonstration, was the microphone in my laptop. And there, I then mixed the input with the chirp, right?
And because we're receiving and generating this simultaneously on the same hardware, it means that the clocks are synchronized, but I'm using audio hardware that doesn't allow you to start the transmit and receive streams at exactly the same time. That's different on, say, a USRP, which I'll show you next. And that's why I have this delay slop factor that you can change, and I'll show you that in a minute. But you mix the two together, and then you get the de-chirped signal. And the de-chirped signal holds wonders that I'll try to explain to you; it means you can do some really sophisticated processing in a pretty simple way and get much richer information out about what's going on in the space around you. So what's important to note here is that once you de-chirp the received signal, you get constant tones. So when it de-chirps itself, you'll just get a DC tone. And then any other reflections will end up generating other tones at different frequencies. And that's the key there. The cool thing about chirp radar is that once you de-chirp stuff, your reflections become tones at different frequencies, and that frequency is controlled by the distance from your radar, which is kind of mind-blowing when you think about it. It's just a really nice way all the math falls out. And then you can just use FFTs and do cool stuff. I don't think so, because you're still just doing it on the basis of the responses of the filter, so you're still looking for a peak response. I mean, you could send out any kind of signal you wanted to, and I'll demonstrate that, actually. The last thing I'll show you is passive radar, using exactly these concepts, using digital television signals off aeroplanes. So stick around if you want to see that; that's what this is building up to. So what's interesting is that in properly deployed radar systems, what often happens is that there's more sophisticated RF plumbing.
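That "reflections become tones" behaviour is easy to check numerically: mix a delayed copy of the chirp against the conjugate of the reference and the FFT peak lands in a bin proportional to the delay. Toy numbers throughout, and a circular delay for simplicity:

```python
import numpy as np

N = 4096
t = np.arange(N) / N
k = 512.0                              # chirp rate: cycles swept over the record
ref = np.exp(1j * np.pi * k * t**2)    # local (transmitted) chirp

delay = 64                             # echo delay in samples
echo = np.roll(ref, delay)             # crude circular delay, fine for a demo

dechirped = echo * np.conj(ref)        # the mixing / de-chirping step
spec = np.abs(np.fft.fft(dechirped))
beat_bin = int(np.argmax(spec))        # beat tone sits k*delay/N bins below DC
```

Doubling the delay doubles the offset of the beat tone from DC, which is exactly the "frequency is controlled by distance" property; the zero-delay self-mix term is the DC tone he mentions.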
So when you have a receiver and you're continually transmitting, you don't want to have to ingest your local signal, because it's always going to be the strongest thing you hear. And considering that it's eventually going to be fed into an analog-to-digital converter that has a fixed dynamic range, you don't want to blow out your receiver; you don't want to blow out the ADC. And so what you can do is actually mix out, or null out, the tone that you're transmitting, just using analog RF mixers and blocks, or DC blockers, for example. And that means that you take the power out (you subtract the actual RF power, not digitally, but in the actual electromagnetic domain), and then all you're left with are the responses, the echoes. And that way you can much better utilize the dynamic range of your ADC and see reflections further out. So, from Wikipedia as well, this is a nice diagram to illustrate that kind of RF setup. So again, you've got your generator and power divider; transmit that out, and split it off; bring in the output of your receiver through a preamplifier; mix out the signal that you transmitted; low-pass filter; amplify again and into your ADC; and then you do your digital processing. So that's the simple kind of structure. So what I'm going to try and do now is take you through how you do range calculations, and then (this is the magical new part) how to do Doppler calculations using FMCW and GRC. And it's basically all just FFTs. This appeared like a massive mystery black box to me for the longest time; when I finally sat down and looked at it, it's actually really, really elegant. So my goal here today is to make you want to try to understand at least a little bit of this. So this is a plot in the frequency domain, right? We talked about chirps; we know that they look like this on the spectrum. Here you've got time going from left to right and frequency from bottom to top.
And the red signal here is the chirp that you're transmitting, and the green signal is the echo. So remember, an echo is just going to send back what you transmitted with a delay in time, right? So as you can see, the green is delayed a little bit, by delta t, from the original transmission. Now, the beautiful thing about this, as I mentioned, is that any time delay will produce a frequency shift at any single sample in time. So if we look at time t1, you've got a frequency shift from your transmitted tone at that particular point in time to your received echo. And what happens is... what was I going to demonstrate to you there? Maybe the whole thing here. Yeah, that's what I was going to show you. So going back to the audio demo, we had this, right? And we know what the VCO input looks like; we know what the VCO output looks like. And now we're going to look at the FFT of the output of the de-chirping, of the mixing, right? So what this is doing is taking the chirp and multiplying it by what is being received from the microphone. And what you'll notice here, if you compare this one to this one (hopefully you'll see this), is that it looks like the spectrum is being rotated over and over and over again at a fixed rate. See how it looks like the entire spectrum is shifting? So, the left-hand side here... actually, do people want to whistle? You'll see your whistles come out, but they'll be diagonal lines here. So see, this is me. And this is the cool thing. Our chirp signal, remember I was telling you, would turn into a fixed tone? Can you all see the fixed tone here? It's that line coming down the middle at roughly DC, right? That is the tone that we want to receive. And any other tones that end up, I think in this case, to the left of it will be echoes that have come back from around the room into the microphone on the laptop.
And so that frequency difference (remember, this is in the frequency domain now) is going to give you your virtual range information, right? Which is pretty neat. Now, what does this look like? I showed this last year, but hopefully I've explained things a little better this time so that it makes more sense. What I'm going to show you now is the same thing, but actually running in the proper mode. So this is the chirp again. It's running at a higher rate because we want to be able to update the display more often. And you see, when I'm talking, it's still picking me up, but because it's constantly rotating the spectrum, it doesn't look like a proper frequency response now. So what I'm going to do is get my other laptop to be the target. I'm going to set my delay slop so that you see the DC part of it somewhere. Okay, apparently it's not there. Nate Temple, the genius here, pointed out that I'm trying to move my laptop in front of speakers that aren't actually outputting anything. I want to use the speakers in my laptop, because that's where the microphone is, not those speakers. Thanks, Nate. Nate is an absolute legend, by the way. If you didn't get to see his Cyber Spectrum talk from the other night, I recommend you go online and watch it, because he has built some amazing stuff. All right. Here we go. This is better now. So, this is the DC component. And because I'm talking, right, we're just going to get this noise showing up here. But if I put my laptop here, then you can see there's that main response there, and that is directly related to the height of the laptop. Don't clap, though, because then you'll offset the microphone. That's all right, I planned that. So what's happening here? A chirp is being sent out from the laptop. I mean, this is a chirp: those air molecules are being vibrated. And then that audio wave is moving up, hitting the bottom side of my laptop, and being reflected back down into the microphone.
And amazingly, the microphone can discern, even though it's hearing itself, the response from an object way out here. So that's the power of FMCW. Now, there was another little thing on my screen in there. I will show you that; it's actually the Doppler plot, and I'll come to it. What that gives you is not only range information but velocity information, so you can tell how quickly a target is moving, and that's really cool when it comes to aircraft. So what I want to show you here now is this plot where, instead of looking at the original frequency domain of what we receive, this is the domain once we've done the de-chirping. So we have that DC component here. And then the green line, this is now our target return tone, right? Because we've de-chirped it. And so you have this difference in frequency. So the frequency change implies that time delay, and therefore virtual range. Now, what's neat is that you can take the de-chirped output and pass it through an FFT, right? Because an FFT will give you the energy in each bin. And the trick here is in how you size the FFT. You know how you have a certain size for your FFT, a certain number of points that you transform? You can make the number of points in your FFT the same as the number of samples that make up a single chirp. So you decide how long your chirp is, and then you set your FFT, in terms of samples, to cover exactly the same amount of time. And what that means is that every single time you do one full chirp, you take one FFT. And I'm trying to illustrate that here with these lines. So if you think about these over time, like a waterfall, these are the frequency bins that you get out of your FFT transform. And in this case, the echo return, the green line, ends up falling into, in this case, arbitrary bin number five. And remember how we said that differences in frequency will imply your virtual range.
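That "one FFT per chirp" framing can be sketched by reshaping the de-chirped stream so that each row is exactly one chirp period. Toy numbers, and an idealized target sitting at a fixed delay:

```python
import numpy as np

chirp_len = 256                 # samples per chirp = FFT size
n_chirps = 32
rng_bin = 5                     # the "arbitrary bin number five" idea

# Idealized de-chirped stream: a constant tone whose frequency encodes range
n = np.arange(chirp_len * n_chirps)
stream = np.exp(2j * np.pi * rng_bin * n / chirp_len)

# One FFT per chirp: reshape into rows of one chirp each, FFT each row
rows = stream.reshape(n_chirps, chirp_len)
waterfall = np.abs(np.fft.fft(rows, axis=1))

# Every row peaks in the same bin, the target's range bin
peaks = waterfall.argmax(axis=1)
```

Each row of `waterfall` is one line of the waterfall display he describes, and a stationary target stays in the same bin row after row.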
Here, the bin that it falls into will give you the range information for that particular bin. So here we can say bin five, and we can calculate how far away that is. And to calculate how far away that is: you know how long one sample lasts, because you set a sample rate on your SDR, right? So 10 megahertz or whatever, one megahertz. So you can calculate one over that to get the period of a single sample. And each bin, in terms of duration, maps to the duration of a single sample. And if you know how long that is (we're talking about RF now, not audio), you can multiply it by the speed of light, and that gives you your round-trip distance, and you can calculate your virtual range. So it's cool, because you can map to frequency really easily just by doing an FFT, and then you can map that directly back to virtual range. Does that kind of make sense so far? I see people nodding, which is good. Now, the wild thing is, in this instance this is a target at a fixed range, and it's not changing. Here, you might have two different targets moving toward and away from your radar receiver, and so you would expect a plot like this. And we kind of saw that before with the FMCW chirp. Actually, one thing I was going to try... I don't have another laptop, but maybe if somebody has a laptop, we can try it at the end. We can have two here, and if you moved them separately, you'd obviously get two different crossing returns like that. Maybe that's what I was going to demo, but I'll do that later, because time is short. So again, as I was saying, the sample rate sets the duration of a single sample, and therefore limits your range resolution. Because consider that you get discretized bins coming out of an FFT, and the total length of that is going to be your unambiguous possible round-trip time.
And so your pulse repetition frequency sets your unambiguous range, but the sample rate also sets the distinct range that you can resolve within a single bin. So as you step from one bin to the next, that's going to be some amount of range that you cover from one to the next. And so what you have to consider is that targets, depending upon your radar system, might be differently spaced but still fall within the range of a single bin. And so you'll get one return for multiple targets in a single bin, and that's ambiguous. So you want to do something, which is Doppler processing, to make that unambiguous. And as I was saying, you've got different radar geometries. You've got a monostatic system, where the transmitter and receiver are co-located, and so it's just a direct line-of-sight round-trip time. But if you have a bistatic radar system, where the receiver is separate from the transmitter, then you have to take into account the geometry, because that delay is going to be different: you might have a target way out here and the signal will bounce out and then come back, or it might be in between the two, and so on. So it's important to keep that in mind. So what I was talking about there is that with RF, we know what the speed of light is, and for a single sample that's going to be a very large distance, because we're talking about the speed of light, 300,000 kilometers per second. So even at a high sample rate, megahertz, a single sample is still going to allow light, or in this case the RF energy, to travel a long way. And so you can increase the sample rate to give you better range resolution, but there comes a point where, if you're just using an SDR and a laptop, you can't go to gigahertz' worth of bandwidth; you will end up being constrained by your bus and by your processing speed.
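As a worked example of that arithmetic, with an assumed 10 MS/s sample rate (any rate works the same way):

```python
# Mapping a de-chirped FFT bin back to range, monostatic case.
# fs is an assumed example rate, not a value from the talk.
C = 3.0e8                     # speed of light, m/s (rounded)
fs = 10.0e6                   # SDR sample rate: 10 MS/s

sample_period = 1.0 / fs      # 100 ns: one bin step ~ one sample of delay
range_per_bin = C * sample_period / 2.0   # halve for the round trip -> 15 m

def bin_to_range(bin_index):
    """Virtual range implied by a given FFT bin."""
    return bin_index * range_per_bin
```

So at 10 MS/s, "bin five" is roughly 75 metres out, and you cannot tell apart two targets within the same 15 metre cell from range alone, which is exactly why the Doppler dimension is worth having.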
So what you should consider then, with an FFT here, is that this is an FFT, and you have these range bins, and every time the chirp finishes it does the transform, and then it just starts again and keeps cycling like that, over and over. So you can see that cycle there. And we're talking about multiple targets fitting into a single range bin. And so once you do the FFT and take the magnitude, you basically get an energy response at that range. So as I was saying, you might have two echoes that fall into the same range bin; they're not the same target, they're slightly different, but from the point of view of your receiver, they end up giving you the same round-trip time. And the other problem, too, is that you might have a radar system (I'll show you this with a passive one) where the transmitter is so strong, even out to many range bins, that it completely swamps your receiver if you were just doing this transform. And what's nice is that you can do Doppler processing, which reveals hidden information lurking in the phase information that's output from the Fourier transform, and that will actually show you these targets, even though ordinarily you wouldn't see them because you're swamped by your local transmitter or by ground clutter. So yeah, the clutter could affect that bin, so you don't see them if you're just looking at range information, or it'll take out your entire transform. So, the Doppler effect. We've heard about this, and we know that it will usually cause a shift in frequency; the classic example is an ambulance driving by you. In this case, what I'm going to try to illustrate to you is how it changes the phase of the echo that comes back to the radar. That's the key here.
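The size of that pulse-to-pulse phase change falls straight out of the geometry: the two-way path shortens by twice the distance the target moves between chirps, expressed in wavelengths. A small sketch with assumed numbers (2.4 GHz carrier, 100 Hz pulse rate, a slow 0.3 m/s target; none of these come from the talk):

```python
import numpy as np

C = 3.0e8
fc = 2.4e9                  # assumed carrier frequency
wavelength = C / fc         # 0.125 m
v = 0.3                     # assumed radial velocity toward the radar, m/s
prf = 100.0                 # assumed chirps per second
T = 1.0 / prf

# Two-way path shortens by 2*v*T between successive chirps, so the
# echo's phase in that range bin advances by this much per pulse:
dphi = 2 * np.pi * (2 * v * T) / wavelength

# Over the integration period the echo is a phasor spinning at the
# Doppler frequency f_d = 2*v / wavelength
p = np.arange(100)                       # 1 s integration at this PRF
echo_phase = np.exp(1j * dphi * p)
f_doppler = 2 * v / wavelength           # 4.8 Hz with these numbers
```

An FFT of `echo_phase` peaks near the Doppler frequency, which is the whole trick explained in the slides that follow: the second, column-wise FFT reads off how fast that phasor is spinning.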
So we get a phase change due to motion. And again, if anybody has any questions so far, or any comments, or if you want me to repeat anything, please ask; don't be afraid, because it took me a while to figure this out. Yeah, so it depends on how you conceptualize it and how you receive it. In this particular case... I mean, I'll illustrate it here and hopefully it'll make sense. The idea with Doppler processing is that you receive echoes not just from a single chirp, but from multiple chirps over some arbitrary period that you select (that will also have implications on what you can resolve); this is referred to as the integration time. So let's say you have an integration time of a second, and let's say you're sending out a hundred pulses per second: you collect a hundred of those returns, you process them in one batch, and from that you extract Doppler information. And the key here, and I'll illustrate this for you, is that instead of just ending where we were before, where you do an FFT of your range response bins, you build up a hundred of those range response transforms. So imagine they're in rows, and then you run FFTs on the columns. So what you end up doing is a second series of FFTs, on each individual range bin, over the integration period, and I'll show you what that looks like. The other thing is to remember, again, the geometry. This is Doppler velocity with respect to your radar system, not with respect to the moving object. So if you consider this scenario on the left here: you've got a radar transmitter/receiver that sends out a pulse, and the airplane is moving tangentially at that moment relative to the radar receiver, so it's not going to impart any Doppler shift on the return, because there's no velocity component in the same plane as your radar signal. Whereas in the second scenario, in the middle, if the plane's moving toward the radar receiver, it's going to impart the biggest Doppler
shift on the return. And if it's offset by some amount, then even though it's not entirely tangential like in the first scenario, it's still going to impart a little bit, because there is still a component that lines up with the plane's motion. So, with Doppler processing, consider the scenario where at one moment the target is moving toward the radar. There, at t0, the pulse is sent out; at the round-trip time divided by two, the reflection is sent back; and at the round-trip time, you get the return. And you can see that throughout that period the signal amplitude decreases, because of course there's that loss through free space and on the reflector. And what I want to illustrate here is that when that return comes back at a particular point in time (it's always going to come back at a particular point, which implies a particular range), there's going to be a phase difference, which might be zero (there might be effectively no phase difference) or some phase difference, between the signal that you sent out and the signal that comes back. And if you consider the actual propagation of a sine wave (even though, you know, they're photons and whatnot), then depending upon where that plane is, whether it's here or a meter over here, offset, the wavefront will hit it; and if you imagine that wavefront hitting it, it'll hit it at a particular phase and then reflect at that phase. And if you move the plane a little bit, then the tone will hit the plane at a slightly different phase and be reflected at a different phase. And in this particular case we're lucky, because everything happens to line up on the phase boundary, and so when you get the response, it actually ends up being the same phase. So here it's zero degrees. And remember, this is all in a single range bin. You could always say, if a plane is traveling, you'll see it move through the range bins, and
that implies that it's moving with some velocity. But this is all within a single range bin, because we want to disambiguate targets and get velocity information. So: you transmit a pulse, receive the pulse, no phase difference. Now let's consider that the plane has moved a meter (and, you know, it also depends upon your wavelength; that has implications too), but here we're talking about the same scenario. And if you look carefully here, the wave that is returned is, oops, slightly offset. See that? So, see there, and then there: it's a slightly different phase on that wave. And so when it comes back, the phase difference is going to be slightly different as well, and when we compare them, we get some arbitrary phase offset. Now, those were two distinct pulses; let's consider four, in more or less successive time order. So the first one comes back; the second one comes back with a difference; the third one comes back, and in this case the plane moved again and imparted another phase change; and then again we've got a different phase offset in the fourth period there. And remember, for every chirp period that gets sent out, we get this energy coming back to the receiver; it's the same chirp, the same signal, and the magnitude will look the same, but the phase will be different. And what you have to think about is this: when you do FFTs, when we look at them on the waterfall, or when we usually think about them, what are you doing? You're taking the FFT, which always gives you a complex output, and you always end up taking the magnitude of that complex output. And the effect of that is (anyone?) you lose data. What do you lose? You lose the phase information. So here we want to use that phase information. And the integration period is important, because that's the period over which you're monitoring these phase changes in each of the range bins. Does that make sense? So the idea is to get that phase information from each FFT bin over that
integration period. And then, think about this: what is a changing phase over time? It's a rotating... what's the magic P word? It's a phasor. All right, so you've got a phasor going around, and I think I have a blue arrow here. So, just to illustrate that very simply in GNU Radio (we usually use fast sample rates, so in terms of spinning, everything is high frequency), here I've got a signal source which is a sine wave, and the frequency is 0.2 Hertz, okay? So that phasor is rotating very slowly, and we can very easily illustrate that. This is an IQ plot; it's a scope plot in XY mode, and you can see the samples. You know, this is a normal sine wave, right? Where are you... it takes a long time because it's slow. There it is. Let's go, I think, strip chart. Yeah, there it is. So we have the normal phasor, the normal sine wave, being generated, but if you're looking at it in XY mode, then it's just going to be a phasor rotating at a fixed rate around the unit circle on the complex plane. So what I want you to think about is exactly this, in a single range bin, okay? Your echo coming back is changing in phase, so you'll get a bit of this in that range bin. And of course, what happens when you take the FFT of a sine wave? You'll get a response in some bin that maps to the frequency of that thing. So what does that imply? Can anybody tell me what happens once you end up taking the FFTs of the columns? Maybe I'll show you the next slide and somebody can tell me. So we did all that, okay, so we... oh, the microphone stopped working. Hello? Oh, there we go. Testing. It's not even wireless, so we can't blame somebody for trying to DoS my mic. Right, so consider this, then. Each row here is a single FFT that we took on an echo that came back, right? Over our integration period, we build up multiple rows. So these are all the transforms here; we're just taking an eight-point transform. So we build up each row, and we have a number of rows that corresponds to our integration period. Then, to
get the Doppler information, what do you do? You look at the phase. But what's the next step here? I mean, I've already said it, so I'm hoping somebody was listening. You take FFTs of... what? The columns. So you build up this memory cell, right? And then, once you've populated all of these with the complex outputs of the first FFT stage, the range information, you take the FFTs column-wise. And so what you're doing is an FFT to look at the phase changes of the returns at each range bin. And what does that mean? The Doppler shift, from the movement of the target, imparts the phase change, and that causes that phasor to rotate. How fast or slow it rotates, and whether that phasor is rotating forward or in reverse, tells you how fast the object is moving. And so if you have two targets that ambiguously end up in the same range bin, but they're maybe flying in different directions or at different speeds, you can use Doppler processing to disambiguate them. So instead of just having a 1D plot, you end up with a 2D plot, and I'll show you what that looks like. So you run the FFTs for each of these columns, and then you get... oops. Now, remember this slide from earlier in the presentation? What was I showing you when I showed that slide? An interleaver. Look familiar? An interleaver. So what you can do in GNU Radio... I just wrote a simple row-column interleaver block, and it takes the output of the first FFT stage; once it fills the interleaver, it outputs everything into another FFT, and then you've got all your data, the range and velocity information. Okay, so: interleaver. And there's a neat little tweak that I made to my interleaver. Usually with an interleaver you need to fill up the entire interleaver before you can read it out, right? So you fill up all the rows and then you read out all the columns. What I did with my interleaver is that you can set... I can't remember what the variable is, but you
can actually cause it to read out more quickly so it'll actually um fill up let's say there are eight rows instead of outputting all the data once it's filled up eight rows it fills four rows moves everything up fills up the last four rows moves everything up and then out all eight and then it fills the next four outputs all eight and then fills up four and so effectively you get twice the output coming out of your interleaver which means that even though you might have a long integration period like 10 seconds we don't want to be waiting 10 seconds to update a pretty plot we want to update it much more frequently and so you have your interleaver spit the data out more frequently so the D chip signal goes into an fft you get your range information goes into the interleaver you do another fft and then magically you get your uh velocity information so now what i'm going to show you is the original audio demo that i had with the with the laptop here and then that plot that i didn't show you is going to be the output of all of this in a 2d image that gives you the the Doppler information okay where are we audio and i will take Nate's wonderful advice again use my laptop speaker the microphone uh let me just fix up the uh so c before the um this is the dc components actually wrapped around because the microphone on the audio card starts sampling at a different time to when the trans the audio speak you know the output visually allen converter started they're offset in in frequency space because they're offset in time and so i need to adjust for that manually myself and just do that right there's that it should look familiar to you and now let me show you the the magic here hopefully it'll work all right here we go so this is the Doppler plot uh you're gonna see a little bit more of this because i'm i'm just going to show you some brief previous experiments after this what i want you to think about here is that this is the waterfall behind it is showing you effectively 
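The two-stage pipeline just described (range FFT, interleave, column-wise Doppler FFT) can be sketched in a few lines of NumPy. This is a toy with made-up parameters, not the talk's actual GNU Radio flowgraph: I place a target's dechirped return directly into one range bin and rotate its phase from pulse to pulse at the Doppler rate, so the column-wise (slow-time) FFT recovers the velocity.

```python
import numpy as np

# Toy pulse-Doppler sketch; all parameters are invented for illustration.
n_pulses = 64        # rows of the "interleaver" (pulses per integration)
n_range = 128        # range bins per pulse
prf = 100.0          # pulse repetition frequency, Hz

target_bin = 40      # which range bin the target sits in
doppler_hz = 12.5    # target Doppler; must stay within +/- prf/2

# Each row is one pulse's range profile (the output of the first FFT stage).
# The target's energy sits in one bin, and its phase rotates pulse to pulse
# at the Doppler rate -- exactly the phasor rotation described above.
t = np.arange(n_pulses) / prf
pulses = np.zeros((n_pulses, n_range), dtype=complex)
pulses[:, target_bin] = np.exp(2j * np.pi * doppler_hz * t)

# Second FFT stage: transform each COLUMN (one FFT per range bin).
rd_map = np.fft.fftshift(np.fft.fft(pulses, axis=0), axes=0)

# The brightest cell of the 2D range-Doppler map lands at the target's
# range bin and at its Doppler frequency.
dop_axis = np.fft.fftshift(np.fft.fftfreq(n_pulses, d=1.0 / prf))
peak_dop, peak_range = np.unravel_index(np.argmax(np.abs(rd_map)), rd_map.shape)
print(peak_range, dop_axis[peak_dop])   # range bin 40, Doppler ~12.5 Hz
```

With two targets in the same range bin at different Doppler rates, the two peaks separate along the Doppler axis, which is exactly the disambiguation argument above.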
In the Doppler plot, the vertical axis is the range information and the horizontal axis is the velocity information. And you can see that as I'm moving around I'm actually causing the echo to change; it's surprisingly sensitive. See how we've got the strong DC part there on the left? That maps, vertically, to that strong bright spot at the top. So what I'm going to do is... I might need to change the color mapping here. Yeah, that's better. All right, let's see how this works. What you'll see is I'm going to be moving the target at a particular velocity, up and down, a little slower, a little faster, and you'll see a bright spot up here, either on the left or on the right depending on whether I'm going forward or backward, and the distance from the center point implies how quickly I'm moving it. Let's see if this works. So there's my target, a bright spot, right? Notice that I moved it and now it's settled. I'm going to move it more; see, now it's on the left because I'm moving it up, and when I stop it's going to come back, because I've stopped moving it. You still get the range return, see the bright spot on the center line, but I'm not moving it, so there's no Doppler. Well, almost none, because I can't keep it totally steady, and it's that sensitive: even as it's waving here you can see it moving left and right. So if I go like this, you get the Doppler response. See? And that's just using audio. Now, who can tell me why we're getting multiple echoes here? Wait, Skyler, hang on a second. Anyone? Anyone else with a hand up the back? Say it anyway. Yeah. We didn't have a microphone, but what the two gentlemen were saying is that the speaker and microphone are so close that the audio bounces off the bottom of my laptop, enters the microphone, but also keeps bouncing back and forth, so we get the multiple returns there. The strongest one is the direct, shortest path. The other thing: look carefully at what happens to the bright spot in relation to the edges of this image as I move it. I'm going to move it very slowly to keep it in there, maybe a bit further out so we don't get all the returns. So now it's on the left; now I'm going to move it fast, and watch what happens. Who said that? That's right, it's aliasing. Why are we getting aliasing? Yeah, that's exactly right. What's happening is that the pulse repetition frequency, how often we send out pulses, constrains the maximum unambiguous velocity you can make out. Once you exceed that velocity, just like a normal signal above Nyquist (half your sample rate, or the full rate if it's complex), it wraps around, because you can't disambiguate above that; it's exactly the same principle. So this is another constraint on the radar system that follows from the parameters you pick initially: as a radar designer you have to define what you want your maximum unambiguous range to be, and then balance that against your maximum unambiguous velocity, because you can't have both; it's got to be one way or the other. Yeah, so that's basically it. I need the sound on again; very good advice from the audience. Any questions about this? No? Just a reminder, then: remember how we were doing the interleaving and the FFTs? That plot we saw is basically the output of that entire process; each of the rows was the output of an FFT. So that's the audio. It's almost time, so if you're interested I'll show you an SDR version. I've got some directional antennas here, and I'll try to show you, because what I've done is I take the bins that map to zero range.
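The aliasing just demonstrated is easy to reproduce numerically. Here is a hypothetical slow-time simulation (PRF and Doppler values invented for the example): sampling once per pulse means any Doppler beyond ±PRF/2 wraps around, exactly like any signal sampled below its Nyquist rate.

```python
import numpy as np

prf = 100.0                       # pulse repetition frequency, Hz (hypothetical)
n_pulses = 64
t = np.arange(n_pulses) / prf     # one slow-time sample per pulse

def measured_doppler(f_true):
    """Doppler frequency the radar would report for a target at f_true Hz."""
    s = np.exp(2j * np.pi * f_true * t)           # phasor sampled at the PRF
    spec = np.abs(np.fft.fftshift(np.fft.fft(s)))
    freqs = np.fft.fftshift(np.fft.fftfreq(n_pulses, d=1.0 / prf))
    return freqs[np.argmax(spec)]

# Within +/- PRF/2 the measurement comes back where it should...
print(measured_doppler(25.0))     # ~25 Hz, unambiguous
# ...beyond PRF/2 it wraps: 62.5 Hz shows up as 62.5 - 100 = -37.5 Hz,
# i.e. the bright spot jumps to the other side of the plot.
print(measured_doppler(62.5))     # ~-37.5 Hz, aliased
```

Raising the PRF widens the unambiguous velocity window, but shortens the listening time between pulses and therefore the unambiguous range, which is the designer's trade-off described above.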
Why the zero-range bins? Because, remember, I was saying light travels really fast, right? Even though I'm using something like a 1 megahertz sample rate, one over one million is still going to be on the order of 150 meters per range bin, so we wouldn't be able to demonstrate a change in range here. But we can demonstrate Doppler, because Doppler still works in the zero-range bin if we're nice and close. So what I do is take an inverse transform of that FFT at the zero bin and put it to the speakers, and it makes something like a theremin: we can wave something back and forth and turn it into a musical instrument. I'll show you that last, in case we run out of time; you can stick around if you want to see it. Right, back to the slides. OK, so that's stuff you can do at home, and now I want to show you some places where this is used in practice, and how you can actually decode that stuff as well. One is CODAR, which is HF radar used to map the surface of the ocean and map currents; you can go to various universities' websites that have these set up and look at live interactive plots of ocean currents. The idea is that you send out an HF wave; it hits the crests of ocean waves, which reflect it back to the receiver, and those ocean waves also have a velocity component, which changes the phase. Once you do Doppler integration over time, you get both range and velocity information for the ocean waves, and the stations are distributed along the coast in certain ways. And they do something interesting, which is gating the transmitter. On an ordinary plot it looks like just a chirp, but if you look very closely, with the right parameters, you can actually see they gate the transmitter on and off to give it dead time to receive a return. The side effect of this is that when you take an FFT you end up with sidebands on your main signal, and you might think they're returns, but they're just the result of taking the frequency transform of this gated waveform. If you do the math, they'd be millions of kilometers away in terms of range; the stuff you want to see is really close in. So that's what it looks like: you've got your main chirp and return hidden in the middle, and those symmetric lines going out are the AM sidebands that you can forget about. I showed this before: there are a bunch of HF CODAR stations running on the spectrum all the time, so you can pick any one of them. I've basically talked about all that, and a shout-out again to Pieter Ibelings and Moe Wheatley, of RFSPACE and SpectraVue. Pieter actually made these Vivaldi antennas; really cool, if you want to check them out after the talk. They've used SpectraVue and his SDRs to produce these amazing plots where you're not looking at the ocean returns any more: you're looking at returns from the ionosphere, because it's HF, so it'll still travel all the way up to the ionosphere and come back as a return. This is one of RFSPACE's tweets; really cool stuff, and this is SpectraVue producing the plot. So in this way you can use the signal in an unintended way to look at natural phenomena going on around the Earth. Another very amazing SDR guy, Juha Vierinen, has done incredible stuff: he's used this kind of UHF radar to map the surface of the moon (amazing paper), and he's done really cool stuff with ionosondes. I tried my hand at this, and you also get those multiple returns. This is frequency over time as an ionosonde sweeps the entire HF spectrum; you can similarly dechirp it, because it's just a chirp, and then get these multiple reflections from the ionosphere. We went out hiking, and after the hike my wife was very kind and just read her book in the car.
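That sideband effect is pure Fourier math, and you can reproduce it with a plain tone instead of a chirp. In this sketch (all numbers invented, not a real CODAR station's parameters), a carrier gated on and off by a square wave grows symmetric sidebands at the gating rate and its odd harmonics; they look like extra returns but carry no range information.

```python
import numpy as np

fs = 8000.0                           # sample rate, Hz (arbitrary)
n = 8000                              # one second of signal
t = np.arange(n) / fs

tone = np.exp(2j * np.pi * 1000.0 * t)    # stand-in for the radar sweep
gate = (np.floor(t * 50.0) % 2 == 0)      # on/off gating at 25 Hz, 50% duty

spec = np.abs(np.fft.fft(tone * gate))
freqs = np.fft.fftfreq(n, d=1.0 / fs)

# The strongest line is still the carrier...
carrier = int(np.argmax(spec))
print(freqs[carrier])                     # ~1000 Hz

# ...but equal-strength sidebands appear 25 Hz either side of it (and at
# odd multiples of 25 Hz further out), purely because of the gating.
print(spec[1025] / spec[1000], spec[975] / spec[1000])
```

With a 1 Hz bin spacing here, bins 975/1000/1025 are the lower sideband, carrier, and upper sideband; the ratio printed is roughly 2/π, the first Fourier coefficient of a 50%-duty square wave relative to its DC term.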
I set my laptop up in the boot and received CODAR from around that area. You may recall me showing this previously: the x-axis is time, the y-axis is frequency, so this is the CODAR over time, and you can see the AM sidebands again. There's some information hidden in there; we're only interested in the center one if you zoom in. In this case I don't think I had it GPS-disciplined. You want a GPSDO so that the phase information is correct, because remember, if you're running your interleaver and the Doppler processing, you want everything perfectly aligned and stable; otherwise you'll introduce phase shifts that will corrupt your velocity measurements. So there's something going on there; I didn't know where it was. Again, that's range information, effectively, in the middle, so you're getting returns somehow. Then we went out one night, set up, and did some captures with a GPSDO; we had a longwire set up and a custom-made dipole. I'm going to show you a video here. This is the same kind of output from GNU Radio, using that same plotting block (it's a modified SDL sink that also outputs a series of bitmaps). So this is everything I've spoken about so far, exactly the same process, but applied to the HF CODAR returns I received on my SDR. Now, the disclaimer is I have no idea exactly what we're looking at. Obviously there's range and velocity information in there, and you can see some cool stuff happening: there are these returns here, and in terms of velocity, which is on the y-axis, things are moving in a consistent direction, toward the receiver. So there's stuff happening, and there are weaker responses out there. I don't know whether this is purely from the ocean waves or from something else; honestly, I meant to do the calculation but didn't get a chance, so if you know what's contributing to this I'd be really curious. But you can take the parameters of the radar system and calculate how far out those returns are and what the implications are. And it's this information that the CODAR system takes and maps into ocean currents, from information gathered at multiple points, like you saw on that map before. The last thing I want to show you is the television version, with aeroplanes. This is passive radar again: with CODAR I was using someone else's signal, and in this case I'm using the digital ATSC terrestrial TV signal. Remember, like I said with the chirp, if you have a fixed thing you're looking for, you want to run it through your filter. In the case of ATSC, happily, in the middle, between picture information, there is a synchronization sequence of a fixed length, and it's documented. So that now becomes the filter: not an FMCW chirp any more, but the standards-based field synchronization sequence for ATSC, which we end up filtering and correlating on. It's in all the TV signals, and what's nice is that they use incredibly powerful transmitters that broadcast the signal over a broad area. I had this set up first at my house, where I had a directional antenna pointing out at a highway. This is the view from our place: there's the 280 highway in San Francisco running past; cars are moving back and forth, one direction of traffic on top, the other on the bottom, and they're metallic, so they might make decent radar returns. That's the receiver site; the transmitter site is over the hill, which is nice, because we don't get the direct path from the transmitter: we get the reflections from stuff, and the direct path is attenuated. This is where I want to illustrate the power of Doppler processing. In this case Doppler (velocity) is on the y-axis and range is on the x-axis. What this is doing is correlating that known synchronization sequence in the signal, and the strongest return out of the filter ends up in the center there, so you can ignore everything to the left, because that's all picture information; the stuff here is basically time zero in terms of the synchronization signal. Now, if we weren't doing any Doppler processing, if we didn't care about velocity and only had the range information, look at this range line, the green line: there's nothing discernible going on there. You will get reflections from static structures that are large and reflect a lot of the signal, but for cars and aeroplanes the radar cross-section is so small they can't send any appreciable signal back to this small directional antenna. Once you add velocity information, it's much easier to discern things: we go from white to red to green, the ROYGBIV color spectrum, and the black and blue are down the other end of your power range, still within the dynamic range of your ADC. So actually, that's a target right there. I'll go back to the beginning, because I think there was... oh, look, did you see it? No? That's not on my display; let me try that one more time. So at the beginning some funky stuff happens as the interleaver fills up, because there's still empty data in the interleaver, and then it settles. Just look here and you'll see little spots appear; I'm pretty sure they're cars on the highway reflecting the TV signal. There's another one there. And they appear very briefly, because if you think of them like a mirror, you only end up seeing the specular reflection off the car.
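Backing up a step: the correlation stage described above is just a matched filter against a known sequence. Here's a minimal stand-alone sketch; note that the sequence below is a random stand-in, not the real ATSC field sync (which is a defined PN511/PN63 pattern in the standard), and all the numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the known field-sync sequence; any fixed, known sequence
# with a sharp autocorrelation illustrates the idea.
sync = rng.choice([-1.0, 1.0], size=511)

# Simulated receive: a direct-path copy of the sync plus a weaker echo
# delayed by 37 samples (the bistatic delay off a "target"), plus noise.
n = 4096
delay = 37
rx = np.zeros(n)
rx[100:100 + sync.size] += sync                        # direct path
rx[100 + delay:100 + delay + sync.size] += 0.5 * sync  # target echo
rx += 0.05 * rng.standard_normal(n)

# Matched filter: slide the known sequence over the received samples.
corr = np.abs(np.correlate(rx, sync, mode="valid"))
direct = int(np.argmax(corr))            # strongest peak = direct path (t = 0)
masked = corr.copy()
masked[max(0, direct - 5):direct + 6] = 0.0
echo = int(np.argmax(masked))            # next peak = the target's echo
print(echo - direct)                     # -> 37 (bistatic delay in samples)
```

In the real passive setup the "direct path" peak is the attenuated transmitter signal at the plot's center, and the Doppler dimension comes from repeating this correlation once per field sync and FFT-ing across those repetitions, just as with the pulses earlier.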
The car has to be at the right angle: the door, or the windshield, or whatever it is, with respect to both my receiver and the TV transmitter, to produce the reflection. Another important point here is that I have a GPSDO disciplining the oscillator in my receiver, but that's still not good enough; there is still some phase offset. So there's an additional series of blocks I have that measure the number of samples between each successive primary return of that ATSC synchronization sequence, and there's a polyphase resampler that uses a long double ratio. Not a float, not a double: a long double, because there are so many decimal places in the rate adjustment needed to lock on exactly to the signal and not have the range information wander. And then you have to track the phase of the primary return; it gets a bit messy if you want to do it well, but you can see there's stuff going on in there. Now, to prove this a little more with the aeroplanes: Ian and I went out and parked in this car park here, which happens to be almost right under the approach path for planes coming in to land at San Francisco. The geometry there is really nice: we've got a TV transmitter elsewhere, in the north of the bay, and we're down here receiving the reflections quite close to the aircraft. That's the kind of setup; we're just using a normal TV antenna plugged into the USRP. So I've got a couple of videos, and we'll see some interesting things. That's a plane, see that? And it's aliasing, because we can't control the pulse repetition frequency, and so we can't control the unambiguous-velocity constraint: that's defined by the ATSC spec. But you can see these returns coming off planes, and what's interesting is that in terms of range these ones are actually further out; we wouldn't see them at all if we didn't do the Doppler processing.
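To make the resampling-ratio point concrete: if the field sync should arrive every N samples at the nominal clock, but the correlator measures a slightly different spacing, the ratio of the two is your rate correction, and it has to be carried at very high precision or the range bins drift. A sketch with invented numbers (the real sync spacing depends on the actual sample rate; this isn't it), using Python's exact `Fraction` as the stand-in for the long double:

```python
from fractions import Fraction

# Hypothetical numbers -- not the real ATSC field-sync spacing.
nominal_spacing = 260_416       # samples between syncs at a perfect clock
measured_spacing = 260_417.3    # what the correlator actually measures

# The resampler must run at measured/nominal to pin the sync (and hence
# every range bin) in place. It's held as an exact ratio here; the talk's
# GNU Radio block uses a long double for the same reason -- a plain float
# hasn't enough digits to hold a ppm-level correction stably over time.
ratio = Fraction(measured_spacing).limit_denominator(10**6) / nominal_spacing
ppm_error = float((ratio - 1) * 1_000_000)
print(round(ppm_error, 3))      # clock is off by about 5 ppm
```

A 5 ppm error sounds tiny, but over a 10-second integration at 1 Msps it is 50 samples of drift, far more than one range bin, which is why the fine tracking matters.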
But because we do, we basically ignore the zero-velocity line, and since these targets are moving, they make themselves apparent on the rest of the plot. Let me go to the next one; some of them are more obvious than others. Yeah, so you can see that line there, and the reason it's streaked like that: remember, I'm reading out of my interleaver faster than the integration period, so you end up smearing one integration period into the next, and you get a blending of velocity information as you go. Skyler, the question... I haven't quite got my head around that; it's some artifact of the system. Oh, look, there's something there. I'd have to think about that a bit more; it wasn't immediately obvious to me why that's there. Any ideas from anyone? Yeah, so there's a return down there. Some of them are really obvious, and what was interesting is that the planes overhead are obviously going to be very close. I think one pixel here is 30 meters, somewhere around there, maybe 40, so they end up quite close, especially flying in overhead. But what really blew my mind later (I can't remember which of these videos it's in, unfortunately; it might just appear) is that there were dots that appeared further out, and they stayed there with slowly changing velocity. As it turned out, we looked over the bay, and of course, what's on the other side of the bay from San Francisco? Oakland. There was a plane just smoothly cruising in to land at Oakland, but because of the geometry it was headed in a slightly different direction than our planes coming into SFO, so it appeared differently on the plot, both in terms of velocity and range. Oh yeah, so there's something up there. Any questions so far? You saw that streak there; I think this one might be a decent one. [Question] Correct, yeah: if you're transmitting and receiving from the same piece of hardware, everything is inherently synchronized, so you don't need to worry about using a GPSDO, and I'll show you that if you want to stick around. You mean that vertical line there? I think at the moment there are a lot of artifacts during that disciplining process: not the GPSDO one, the other one I talked about, where it measures the number of samples between two of the primary returns and measures the phase. There's some additional processing going on there, and as it does that very, very fine adjustment of the resampling, it produces all these artifacts in the FFTs, and that's why you get crud there. Maybe. [Question] So I don't remember the exact frequency of this television station, but there's a plethora of stations throughout the UHF band, and the frequency has an implication for how the radar cross-section will look, so everything is interrelated there, and we're constrained, obviously, by the station frequency and the spec. So anyway, that brings me to the end of the talk. If you want to hang around, I'll show you the SDR version; it'll just take a couple of minutes, or you can clap now and head to lunch, totally up to you. All right, I'll do it quickly. So here we've got... oh, one thing I want to show you quickly. This is the live processing of the CODAR, right, the HF one, so this should look familiar from the video. What I wanted to show you here is this plot: you know how, when we take FFTs, we always take the magnitude and forget the phase information? This plot is showing you phase information only. Consider it like a waterfall, but instead of showing the magnitude, it's phase. And remember how we had those interesting shapes coming up? There are strong returns, so there's going to be some information there. This part looks like noise, because there's nothing strong there, but amazingly you can see this phase structure in these range bins in the return. I'd never looked at a phase plot like that before, a phase waterfall, and it kind of looks cool. So let me find... I think it's this one. This is two megasamples per second. Oh, let me show you that workbook. All right, here it is. This is a Jupyter notebook: you basically put in the speed of light, your frequency, your bandwidth (which is effectively your sample rate), and it calculates the range resolution; you put in your pulse repetition frequency; you can plug anything you want into the equations, and it outputs all this information. For a given frequency it gives you the wavelength; the bandwidth gives you the range resolution; for a given pulse repetition frequency it gives you the pulse duration, the unambiguous range, the unambiguous Doppler, and the unambiguous velocity, and from that you can figure out the scales on all of these pictures. Let's see what's happening here... oh, it's running, cool. So I've got this flowgraph here, and the plot here. Nothing's showing, which is a little bit... oh, of course nothing's showing. Skyler correctly points out (thank you, another audience contribution) that the default configuration here is everything off, because I don't want to blow anything up. And I've got these two directional antennas here, one the transmitter, one the receiver, so I can put them next to each other; I'm just going to do a very quick setup and put my bag in between, and hopefully that won't affect things. I'm using a B200mini here in a custom 3D-printed case. Let me try doing this... something like that. OK, and now, no guarantees. It also works better if you use separate transmit and receive USRPs.
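As an aside on that notebook: the calculations in it are the standard radar bookkeeping equations and are easy to reproduce. A sketch (the function and example numbers are mine, not the notebook's actual contents): range resolution is c/(2B), unambiguous range c/(2·PRF), unambiguous Doppler ±PRF/2, and unambiguous velocity λ·PRF/4 via the Doppler relation f_d = 2v/λ.

```python
C = 299_792_458.0  # speed of light, m/s

def radar_parameters(freq_hz, bandwidth_hz, prf_hz):
    """Basic radar bookkeeping from carrier, bandwidth (~sample rate), and PRF."""
    wavelength = C / freq_hz
    return {
        "wavelength_m": wavelength,
        "range_resolution_m": C / (2.0 * bandwidth_hz),
        "pulse_duration_s": 1.0 / prf_hz,
        "unambiguous_range_m": C / (2.0 * prf_hz),
        "unambiguous_doppler_hz": prf_hz / 2.0,                # +/- PRF/2
        "unambiguous_velocity_ms": wavelength * prf_hz / 4.0,  # f_d = 2v/lambda
    }

# E.g. at a 1 MHz sample rate one range bin is ~150 m -- which is exactly
# why the close-range demos can only show Doppler, not range change.
p = radar_parameters(freq_hz=2.45e9, bandwidth_hz=1e6, prf_hz=1000.0)
print(round(p["range_resolution_m"], 1))   # -> 149.9
print(round(p["unambiguous_range_m"]))     # -> 149896
```

Note that unambiguous range and unambiguous velocity both depend on the PRF but in opposite directions, which is the same trade-off discussed earlier in the talk.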
For example, I did it with two USRPs connected by a MIMO cable, so you can synchronize the clocks, but the actual boards are separate, and they won't interfere with each other as easily as a single unit does with itself. So what I'm going to do is set the gain; I think I used about 55... I'll go 33. And then I'm going to set (sorry, I have to bring it over here) the amplitude of the signal, 0.7. And what we've got now, hopefully this is familiar to you: the spectrum is being rotated, so this is the FFT of the output of the dechirping, and we've got the tone there, which is what the system is hearing. It's the DC tone: it's hearing itself, basically. Unfortunately that's really high, and it shouldn't be that high, because we won't be able to resolve anything otherwise; it might be too much here. Oh, there it is. See how we've got the center line, which is actually offset? I haven't figured out why that's offset yet. But as I move my laptop toward and away from the receiver, you can see that even though it's in the zero-range bin, and it's still overloading, you still get velocity information. And so that's basically how RF radars work. Now let's see, because I promised you music. Music! Let me plug in the speaker there. It's a little bit finicky, in that I need to do that and then change the offset here so it ends up going into the zero bin, because that's the one being transformed. Do I have the audio actually on? Thank you, another audience participation. Nate, what would I do without you?