All right, thanks. Thanks for bearing with me, and thanks for all turning up. Everyone enjoying DEF CON so far? Well, my name is Balint Seeber. I'm the Director of Vulnerability Research at Bastille, and I like SDR. SDR is pretty cool. I thought today I'd share some more tips and tricks with you. I've got a couple of different projects that I've been working on over many years, and I thought I might just take you through some flow graphs and how to build them up. We'll leave that one for later; we'll build up to that one. Oh, and of course, don't forget — here's the first tip. You might appreciate this one. Oh, what the hell did I just do? XQuartz with multi-monitor is crazy. There we go. So, tip zero: if you're using XQuartz on OS X — who uses a Mac, by the way? And Linux and Windows? If you're on OS X, you can use Homebrew and MacPorts to set up an older version. But there's this button here. I think it's Control-B, right? Control-B. Some other stuff. But I was just updating the Inmarsat stuff, so you might like that. Again, this is pretty experimental, and it's mainly so that people can get an appreciation of what you can do with radio. You might remember this; I've spoken about it a couple of times. This was a system that I put together using GNU Radio back in Sydney. It would track all the planes. Who's tracked planes with ADS-B before? Yeah, it's quite a bit of fun. And the software that drives this aviation mapper has just been collecting dust, so I'm going to open-source it when I get home. I've added support for all the different decoders, so you can do cool stuff like this. But what I want to demonstrate here is that apart from the trails of the planes, you have these dots. And these dots actually represent ACARS messages. Who's read ACARS messages before? ACARS — the Aircraft Communications Addressing and Reporting System.
So a lot of these short text messages are sent to and from aircraft. They contain performance reports, human-readable text messages to the cockpit, information that carries waypoints of the planned flight routes, and so on. And this is really cool because it's VHF, so it propagates a long way, and you can easily receive these on the ground. You can go to your airport and see all the different messages that are going on. But there are areas around the world where you can't receive these terrestrial messages, like over the ocean — think transatlantic or trans-Pacific flights, for example. And there are two other systems that are used there. One uses HF, on the order of 10 to 20 megahertz. They send very short messages over those frequencies, and those obviously propagate a long way, because the wavelength is very long and you can get quite good distances with HF. But the other one is SATCOM. And SATCOM is, of course, enabled by the fact that you have satellites orbiting the Earth that allow you to relay messages. So there's all sorts of interesting stuff you can see. Can anybody tell me what we're looking at here in this diagram? What is this diagram portraying, given what I've just told you? The satellite footprint — yeah, that's exactly right. This is the portion of the Earth that this particular satellite, the one we're going to look at, sees. And there's a nice document the FCC has; you can look that up. And this is an artist's rendition of an Inmarsat-3 geostationary satellite. Inmarsat is a company that has a constellation of satellites, and they offer a whole bunch of different services to different customers in different sectors. The one that we're going to look at in particular is for aviation. They have these geostationary birds placed around the Earth, so they have really good coverage of the entire Earth. And their orbits are nearly 36,000 kilometers above the surface of the Earth.
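As a quick sanity check on that altitude figure, Kepler's third law gives the geostationary orbit radius directly. This is a small illustrative sketch using standard physical constants (not values from the talk):

```python
import math

MU = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
T = 86164.0905           # one sidereal day, s (the Earth's rotation period)
R_EARTH = 6378.137e3     # Earth's equatorial radius, m

# Kepler's third law for a circular orbit: a^3 = mu * T^2 / (4 * pi^2)
a = (MU * T**2 / (4 * math.pi**2)) ** (1.0 / 3.0)
altitude_km = (a - R_EARTH) / 1e3

print(round(altitude_km))   # ~35786 km, i.e. the "nearly 36,000 km" figure
```

Matching the orbital period to one sidereal day is exactly what makes the bird hang at a fixed point in the sky.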
So, obviously geostationary: they spin at the same rate as the Earth spins, so they're always at the same point in the sky. The one that we're going to look at is 3E. Oh, I can't see it. 3F3, maybe? Anyway. What's that? So how does it actually work? Well, in these instances, this is just a dumb satellite. The satellite is a linear transponder: it essentially just receives messages sent from an Earth ground station on one frequency and then relays them back down on another one. And anything coming back up on another frequency will get relayed back down to the ground station. So the objects of interest here are commercial airliners. If you look at them, you can see that on top, one of those bumps is usually the SATCOM antenna. You can make phone calls over this, and it's also used to convey these text messages and so on, so that pilots can maintain communications with home base and air traffic control. There are pretty much two large companies that actually roll out, deploy, and manage aviation networks around the world — the ones that use ACARS, they have their VHF data link set up. And they also have these internetworking agreements. So it's like a large-scale network that planes and operators and so on can use to transfer messages, I guess much like the internet. The ground-station-to-satellite link is C-band, and L-band is the link to the plane. So let's have a look at what the bent-pipe transponder architecture usually looks like. You've got your L-band antenna here, which operates in this frequency range. This frequency is nice because you can use pretty much every SDR to receive it. This is what is sent down to planes, and what the planes communicate back up to the satellite with. So we can, as it turns out, pretty easily listen to the downlink to the planes. And I'll show you what sort of antenna you need to use to do that.
And then on the other end, if a plane sends something up, it gets downlinked back to the ground station on three and six gig — I can't remember which one's which, but that's on the C-band. So if you want to use that, you need a slightly more sophisticated setup, and we haven't got to that point yet. But essentially, if you look at the architecture here, it's basically receiving stuff, doing some processing, going into a beamformer, coming back down, and then the reverse process on the other end. So this is what you need to actually receive the downlink. My friend and colleague Ian Buckley has helped a lot with this. He actually found this old dish, and he created this custom feed. It's a left-hand circularly polarized helical feed. If you look at it, it's pretty much a bent wire that rotates in this particular circular polarization, which is the polarization used on these satellite links. That goes to a bandpass filter, so you filter out all the other noise, and there's also an LNA there to give you a bit of a boost in the signal. Essentially, you can calculate roughly where the bird is in the sky, set the dish up, have a little spectrum analyzer running — FFT, what have you — and then adjust the pointing of your dish until you get good peaks on your spectrum. And then you're good to go. So there's only the one LNA out there now, so it's actually relatively easy to do. So why is this service interesting? Well, as it turns out, there are a lot of different channels on it. And I should put out there that this is just one stab at it. Other people have also done different implementations and looked at it differently — there's some really nice stuff out there. But I just want to share this with you, because I've attacked it from the GNU Radio side, and if you want to build on it at some point, that would be great. But everything begins with the P channel, and that's where the coordination and timing is set for the system.
It's this time-division-multiplexed packet-mode signaling. And if we're listening, the information that's addressed to all the aircraft is contained in little packets called signaling units. It's basically just a constant stream of these — I think there were nine P channels available there. Then there are the R channel, T channel, and C channel, which are used for random-access signaling from the aircraft. So imagine, kind of like cellular networks now, there's a random-access channel for when your phone wants to get the attention of the network. Once you actually coordinate and set up a reservation on the T channel, you have TDMA, so that's for data transmissions from the aircraft. And then you've got the circuit-mode C channel, which carries voice and is also used for data. Once you actually receive and synchronize to the P channel — all of the other channels synchronize their timing to the P channel — and once you go through the layers and end up decoding more of the protocol, you can get to the point where you can actually follow the establishment of calls, decode that data, put it through a vocoder, and maybe get voice. I've not looked at that part of it, but it's not outside the realm of possibility. So that's what it used to look like. I cleaned it up a little bit. And I made it even cleaner. And believe it or not, for those of you that know the running joke, I use hierarchical blocks. I'll go through and show you that shortly, but first I'll go through the system, the theory, and the process of breaking it down, and then I'll step through the flow graph. So this is the old version. What we did initially was just record the spectrum for a while and then do offline processing. Remember, we only want to look at the P channel. And what you can see here are the P channels that are actually available, coming down on — I think they're on the different spot beams.
But because it's a satellite link, obviously your signal-to-noise ratio isn't fantastic. Luckily it's good enough that you can use simple demodulator and clock-synchronization blocks and be ready to go; you don't need to do anything particularly fancy at this link budget anyway. So the first thing, obviously, is you want to select the channel that you want to listen to. There are actually a bunch of P channels, as it turns out; in this case, we're just going to decode one. You want to basically tell your channelizer to move your baseband spectrum to that point and then filter out just that one channel. The narrower you get, the better. But as you will see, one of the common things you have to deal with in a real-world radio system is clock drift. The oscillator that's originally being used to synthesize the signal is going to be drifting in relation to the synthesizer on your own SDR, which means that when you tune to the particular frequency you might know this is at, it might be offset. Or even if you manage to zero it out completely, it'll move back and forth. That could be due to a whole bunch of different factors, but usually you want to implement some sort of scheme to keep a lock on it, and I'll show you how to do that. This is one way to do it. As it turns out, the signal is actually GMSK — Gaussian Minimum Shift Keying. So it's essentially two-level frequency shift keying with a pulse-shaping filter on it. And I'll show you what this looks like in baudline. Who's used baudline before, by the way? A couple of you. I'll show you — it's a really cool tool, one of the most invaluable tools for doing offline and online signal analysis. I can demonstrate that to you. But one nice property about GMSK — about FSK in general — is that if you square the signal, like you literally take a complex sample and you square it, and you take the FFT, you get these two peaks coming out.
And these peaks, as it turns out, are separated in hertz by your baud rate. So immediately from this you might guess how fast the symbols are coming down, if it's the separation between these two. Okay, I'll come back to it. But the point is, as this moves around, you want to be tuned to exactly the midpoint here. And this green line is showing where the tracker is locked onto that line. That was the first primitive stab; it had a problem, and I'll show you why. So the second thing is you want to identify how quickly the symbols are coming down. Instead of looking at it that way, what you can do is take the sample stream and multiply it by another stream, which is a delayed version. You can just delay by one sample, so you basically multiply your current sample by the previous sample. Then you take the FFT of that, and your first big peak is your symbol rate — that's telling you the periodic component of your symbols. And this works with a whole bunch of different modulations, primitive ones. But here it is. So what's our baud rate? 600 bits per second — symbols per second, yeah, that's it. So it's just a really convenient way. I was coming at this, as I usually try to do, from what's called the blind signal analysis perspective: you don't actually know what the properties of your signal are, and you want to use DSP tricks like this to figure out what's going on. So the next step is you've got your raw symbols — and I've oversampled here — but you want to turn those samples into discrete symbols that you can then turn into bits. One neat visualization you can do with samples is, instead of drawing lines, you can draw them as dots. This is where I haven't classified them as bits yet; we're purely just looking at the samples coming in on our channel. And you can see that they're scattered all throughout between 1 and negative 1.
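The squaring trick above is easy to reproduce offline. Below is a minimal NumPy sketch: it synthesizes a random-bit MSK-style test signal at 600 baud (the 48 kHz sample rate and the synthetic signal are my assumptions, purely for illustration — not the talk's actual flow graph), squares it, and reads the baud rate off as the spacing between the two FFT peaks:

```python
import numpy as np

fs = 48000.0          # sample rate (assumed for this toy example)
baud = 600.0          # P-channel symbol rate from the talk
dev = baud / 4.0      # MSK frequency deviation is a quarter of the baud rate

# Synthesize a random-bit, continuous-phase 2-FSK (MSK-style) baseband signal
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 600)
sps = int(fs / baud)
freqs = np.repeat(np.where(bits == 1, dev, -dev), sps)
phase = 2 * np.pi * np.cumsum(freqs) / fs
x = np.exp(1j * phase)

# Square the signal: spectral lines appear at +/- baud/2,
# so the peak separation is the baud rate
spec = np.abs(np.fft.fftshift(np.fft.fft(x * x)))
f_axis = np.fft.fftshift(np.fft.fftfreq(len(x), 1 / fs))
pos_peak = f_axis[np.argmax(np.where(f_axis > 0, spec, 0))]
neg_peak = f_axis[np.argmax(np.where(f_axis < 0, spec, 0))]
est_baud = pos_peak - neg_peak

print(est_baud)   # ~600 Hz of separation -> 600 symbols per second
```

The midpoint between the two peaks is the carrier offset, which is what the green tracking line in the talk is chasing.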
So that means we've got them in the right range. We're essentially doing FM demodulation on this and then adjusting the deviation, or the gain on that, appropriately so it falls between these normalized values. But we want to turn them into a symbol stream. A really easy way to do that is just to slice around the middle. But to slice around the middle, we want clock recovery going on, so that we have a block that will lock on and say: I'm going to sample here, then go forward a couple of samples, sample here, and keep sampling like that — and make sure that we get really good separation in our symbol stream so that we can turn it into bits. I think I may have oversampled by 4 here or something, so there are 4 times as many dots here; you're seeing the intermediary dots between each symbol. And here it's tracking it and just picking out the one sample that represents each symbol. You can see that it's actually doing that really well, because there's this big gap between the bottom line and the top line. So then we can simply say: right, draw a line through the middle, across zero; anything above is going to be a 1 and anything below is going to be a 0. That's a really basic way of doing it. There are far more sophisticated ways of doing it if you know things about your modulation, or you've got a really low signal and so on, but this is just the simple approach. And then the next step is you want to actually visualize your bits to see if there's some sort of structure to them, some sort of pattern. You can use that to further deconstruct and de-frame your packets. So there's a cool raster plot; I'll show you how to do that. Actually, in this flow graph I have it going out a UDP socket, and then I've just got a little flow graph that's listening on that and plotting all the data that comes out.
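The naive clock-recovery-plus-slicer idea just described can be sketched in a few lines: pick the sampling phase with the widest eye opening, take every sps-th sample from there, and slice at zero. This is a simplified stand-in of my own, not GNU Radio's actual clock recovery block:

```python
def recover_bits(soft, sps):
    """Naive symbol sync: choose the sampling phase with the widest eye,
    then decimate by sps and slice around zero."""
    # Mean |sample| at each candidate phase approximates the eye opening
    best_phase = max(range(sps),
                     key=lambda p: sum(abs(s) for s in soft[p::sps]))
    symbols = soft[best_phase::sps]
    return [1 if s > 0.0 else 0 for s in symbols]

# Toy demodulated waveform: 4 samples per symbol (bits 1, 0, 1, 1),
# with the off-phase samples sitting near the zero crossings
soft = [0.1, 0.9, 1.0, 0.8,      # 1
        -0.2, -0.9, -1.0, -0.7,  # 0
        0.0, 0.8, 1.1, 0.9,      # 1
        0.1, 1.0, 0.9, 0.8]      # 1
print(recover_bits(soft, 4))     # -> [1, 0, 1, 1]
```

A real tracker adjusts the phase continuously as the clocks drift; this version just picks one phase for the whole capture, which is fine for short offline recordings.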
And usually, when there is some structure like here — if you're looking for the scan-line length, or your fixed frame length, so that they all pile up on top of one another and you're synchronized — then you'll get these straight lines. If you're offset, they're going to be diagonal, and you just need to adjust your length until it looks good. Once you do that, you can see here that the vast majority of this packet, or each scan line, looks like it's random, right? It just looks like a mixture of ones and zeros, no structure to it. However, in here there is this very repetitive series of bits, which is why you get these vertical lines: each one of these scan lines has the same series of bits here. And then there's some sort of counter or marker happening here as well. So that would be indicative of a synchronization field to begin decoding the packet. So it appears random — and when you deal with links, satellite links in this case, there are often three measures that are employed to add more redundancy and allow you to decode your signal and your packets even in the presence of errors that might have corrupted your data. The first is interleaving, which protects against burst errors; I'll show you why. Then forward error correction: that adds a little bit more data so that you can recover the data if bits are corrupted or flipped. And scrambling: that's important because if you have a long run of the same bits — a long sequence of ones or zeros — there aren't transitions on which the receiver can train its clock recovery. So you want as many bit transitions as possible, so that there are these excursions to either end as often as possible; if you don't have them, you may lose synchronization. And there are fixed scrambling codes that use a polynomial in a shift register, and you can actually deterministically figure out what that is.
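An additive scrambler of the kind described is just a linear feedback shift register XORed onto the data; because the receiver runs the identical LFSR from the same seed, descrambling is the very same operation. Here's a minimal sketch — the 15-bit mask and seed below are placeholders for illustration, not the actual Inmarsat values:

```python
def lfsr_sequence(mask, seed, nbits, length):
    """Generate a scrambling sequence from a Fibonacci-style LFSR.
    mask selects the feedback taps; seed is the initial register state."""
    state = seed
    out = []
    for _ in range(length):
        out.append(state & 1)                  # emit the LSB
        fb = bin(state & mask).count("1") & 1  # parity of the tapped bits
        state = (state >> 1) | (fb << (nbits - 1))
    return out

def scramble(bits, mask=0b110000000000000, seed=0b100101010101010, nbits=15):
    """XOR the data against the LFSR output (additive scrambling)."""
    seq = lfsr_sequence(mask, seed, nbits, len(bits))
    return [b ^ s for b, s in zip(bits, seq)]

data = [1, 0, 1, 1, 0, 0, 1, 0] * 12   # a toy 96-bit "signaling unit"
scrambled = scramble(data)
assert scrambled != data                # the long runs are now broken up
assert scramble(scrambled) == data      # XOR twice restores the original
```

The key property for clock recovery is that even an all-zeros payload comes out of the scrambler full of transitions.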
So you can get that far. I found a couple of hints, which is what this is about. The hints are that a frame payload consists of multiple fixed-size signaling units; for the P channel it depends on the data rate, but here it's six SUs, and each of them is 96 bits. For transmission, you basically construct the entire group, scramble it, use a half-rate convolutional encoder, and then put it through an interleaver. So what does that actually look like? You have your six 96-bit signaling units here. You have a scrambler: you generate your scrambling sequence, take all the bits, and XOR them together. Once you've got that, you put them through a convolutional encoder — I'll show you what that looks like — and then you interleave them and create your payload. Now, that fixed set of lines that we saw before, that's your unique word and the frame header, and that's what's used to demarcate the payload. These just run back to back the entire time on each one of those channels. So what is an interleaver — or rather, how does the deinterleaver actually work?
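Putting the transmit chain together, here's a sketch of the encode side (leaving scrambling out). The K=7 polynomials 171/133 octal are the Voyager-era ones mentioned later in the talk; the 32×36 interleaver geometry is my own guess purely to make the 1152-bit payload divide evenly, not the documented layout:

```python
def conv_encode(bits, k=7, polys=(0o171, 0o133)):
    """Rate-1/2 convolutional encoder: shift each data bit into a K-bit
    register and emit one parity bit per generator polynomial."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & ((1 << k) - 1)
        out.extend(bin(state & p).count("1") & 1 for p in polys)
    return out

def interleave(bits, rows, cols):
    """Block interleaver: write row-wise, read column-wise."""
    assert len(bits) == rows * cols
    return [bits[r * cols + c] for c in range(cols) for r in range(rows)]

# Six 96-bit signaling units -> 576 bits -> rate 1/2 -> 1152 coded bits
frame = [i % 2 for i in range(6 * 96)]
encoded = conv_encode(frame)            # 1152 coded bits
payload = interleave(encoded, 32, 36)   # same bits, burst-resistant order

print(len(frame), len(encoded), len(payload))   # 576 1152 1152
```

With this geometry, consecutive transmitted bits sit 36 positions apart in the coded stream, so a short burst of channel errors turns into isolated single-bit errors that the Viterbi decoder can mop up.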
In this case, you can consider an interleaver as a memory cell. You write into the interleaver in one particular order, and you read out of it in another order, and what that does is distribute the bits temporally. So we've got n rows and m columns. In terms of writing in, we're going to write row-wise, like this: we write across each row. And once the interleaver is full, we read out column-wise. So instead of going out the same way, we go one, five, nine, thirteen, and so on, like this, until we've read out each of the columns. What we've effectively done is, instead of having one and two next to one another, we've got one and five — we've redistributed them temporally. Because remember, we've also got forward error correction on this. If we weren't interleaving — if we just had the bits serially, in order — then a burst error that affected some of these bits might affect too many in that one small group for the error corrector to actually correct fully, because there's only so far an error corrector can go given the amount of additional bits that you're putting in. However, if you distribute these bits temporally, so they're not in sequence anymore, then if we end up affecting these three bits — one, five, nine — we can still rely on the error-correcting data that might be in this smaller group around one to fix one, and then also fix five, and then also fix nine. So it's just a really effective way of doing it. It's pretty much used in every single modern communication system — even on CDs, for example. You know how you can take a drill and drill a hole in your CD, and it'll still play back? It's because they use a hell of a lot of this and other error-correcting codes. And then the next step is a convolutional encoder. This is used in — I mean, this has been superseded now by much more modern codes like turbo coding that's
used in LTE, but this was used for a very long time and still is used a lot — the NASA space probes used to use this, and so on. What you do is feed your data bits into a buffer, and then you basically take these taps off and produce extra bits for the incoming bits, and that adds the redundancy. Depending on the polynomial you've used — how you've configured this — you can generate what's called this trellis. I won't go into details, because it's a bit outside the scope here, but you can then use a decoder, and as you fill in your received symbols, the decoder will try to find the most likely path to traverse through the trellis that matches the symbols you actually received. So although you might receive some symbols in error, which won't exactly match what's in these elements here, it will still find the most likely path — because there are going to be paths that are more likely than others — and it'll use that as the data you actually received. In that way you can correct, or recover, the bits that might have been corrupted. It's a really, really neat system. One problem with this approach is implementing it on a PC, because with processors, although there is parallelism, you've only got a certain number of cores. This is actually highly parallelizable: on FPGAs you can pretty much do the entire thing in one step, because you can have 256 or, really, many more branches and compute it really quickly. But on a serial machine you actually have to traverse them all, and that can be quite computationally expensive for very large values of K, which defines your constraint length — how long the trellis is. But what's nice is that GNU Radio has blocks to do all this kind of stuff already, and in gr-fec, which is a submodule in the main tree, there's this nice extended decoder object, and you can configure that with this definition. And as it turns out,
they actually use the original configuration that was on the Voyager space probe. So that's a convolutional code with K=7 and these two polynomials. You put that in there, and magically things start working. The next step is that they use scrambling. You can set up your own scrambler — an additive scrambler; there's a different sort called a multiplicative scrambler, but here we use additive — and we implement that as a linear feedback shift register. The one that's already in GNU Radio didn't quite do what I wanted; it read out the bits before, or after, the opposite of what I needed. So I just modified that into this new one, and you can use this mask and the seed to initialize it. Every time you reach a frame boundary, it spits out your scrambling sequence and descrambles the frame. So when you actually do that — remember that really noisy plot we had before? Now it looks like it has far more structure, and it's repeating. Can you see that? It's not that random mess anymore. That's really good; that means we're on the right track. The last thing in a communication system is often adding a checksum on the end of your frame, so once you've decoded it, you can verify that the data you've received and decoded is actually correct and free of errors. In this case they're using CRC-16-CCITT, which is one particular variant, and that's just on the end of the payload there; when you run it, you should get this value. So once you do that, the program will start spitting out these SUs — this is 96 bits' worth, plus the CRC on the end, and these just come out in order, one after another. Well, what are they? As it turns out, they mean a whole bunch of different things, but what I was particularly interested in is that there's a certain combination of bits — the most significant bits of the first byte — that determines what sort of message it is, and in particular, this user-data message. So what they do is, it can be of arbitrary length and cross
multiple signaling units, and there's a sequence number that's indicated by the rest of the bits in the message. As you can see, it counts down: this indicates that we're starting an ISU, and then it's D, 6, 5, 4, 3, 2, 1, 0. At the end of it, once you've been accumulating those bytes, you can turn that into a message. And there's a whole bunch of other stuff going on here — remember, this is the coordination channel, so you see logons and acknowledgments and information about the constellation and system time and so on. So once you accumulate that — in this case it was going on for longer, and it finally ends when you get down to zero — you can turn that into, as it turns out, a message that looks an awful lot like the sort of ACARS messages that we see on the ground. So in this case — this has some weird formatting, but I actually see this often — "please arrive at the boarding gate at least however-many minutes before departure; late passengers may not be accepted." I thought they weren't generally accepted anyway. But you also see messages from all around the world. Actually, just this morning I was seeing them from Hong Kong and, I think, Singapore: notices regarding those particular airports, and weather reports. These are all being downlinked to planes that are probably en route to those airports; they want to know what the weather conditions are, and instead of calling someone up, they can just request the weather to be sent over Inmarsat to the cockpit. So this one is saying that there's a taxiway K that's closed between runway 33 and taxiway J, blah blah blah. And then you have these METAR weather reports; if you look online you can figure out what they mean — they refer to visibility, that's overcast, and various other aviation acronyms. And this is where it gets really interesting: CPDLC and AFN. These are a layer on top of the messaging protocol, and again it looks like some odd things, but you know,
here you've got some sort of obvious encoding. This is another ADS protocol — that'll be one of the next steps to look at — but this actually relates to airliners logging on and off these various networks, various updates as they fly through the sky, and then scheduling. Often you see all sorts of free-text messages too; here it was sending various names and so on of the flight crew. So let's actually have a look at what the flow graph looks like. Not that one — let's start with this one. This is the original one. Testing, one, one, two. So, one other nice thing you can do — and I'll show you how this works — is that if you've got simple data streams like this, it can also be a bit of a clue to actually listen to them. Because this is slow-rate data and it's FSK, you can pretty simply modulate this into your sound card, and you get something that sounds like that. And, you know, that sounds like it should. So you can see down the bottom there that it's actually working — I'll bring the console over so you can see. It's just getting this on one particular channel, and it's constantly decoding. When it gets a message, it'll print that out, but usually it's these fixed frames that are happening all the time, and then every so often you see the interesting text messages. So if you look at the flow graph itself — it's going to fit, more or less. Here we've got these various channels, and what's neat is that in this GUI you can just click on one — oh, what's that? — you can just click on, come on, you can click on one of these and then see what the baseband looks like, do that power stuff I was talking about with the squaring and looking for the peaks, looking for the baud rate and so on. I'll show you this in the newer version, but initially I just thought, well, why not do some automatic tuning once you actually click on one of these things — the channels are
really narrow, so you have to be pretty accurate, but you can't be accurate all the time, so you want it to do some automatic tuning. If you look here, you see how the green line is tracking over and trying to find the midpoint. In this case I was using that squaring technique with the peaks — see, even that should get it; the newer version does get it, but that's still slightly off. So what this is doing is taking the output of this FFT, looking in one half for the highest peak, and then using that to calculate the frequency offset. Because this peak should be at 300 — it should be at half the symbol rate out from the middle — whatever the difference is between that expected peak and where the actual peak is, that's how much it'll shift the frequency over automatically. But the obvious problem with that is that if you've got two peaks in one side of the spectrum, you won't know which one to use. So that didn't work, but the implementation is a bit of fun — well, it's a mess, really. You go into this FFT, and when you take the output of that FFT — these are some nice blocks you can use — you basically take the magnitudes, take the log of that, and then you use Argmax, which finds the maximum input value and the index where that maximum was found in the array you pass through. Then you hook the index into this Probe Signal block, and what Probe Signal does is basically just feed on the incoming data. Here, in this function probe, you supply the ID of the probe signal — which is probe_max_fft_index — and the function probe basically just calls level(), which is a function exposed by Probe Signal, and that gets the index, I guess, five times a second or something. When it does that, you put it through one of these variables, and it calculates that as a frequency offset, which is then fed back as a variable into this rotator. So it takes the fine offset, divided by the
baseband sample rate, and then multiplies it by pi, and so it shifts your baseband around so it lines up. But that wasn't so good, because of those obvious issues I pointed out. So let's take a look at how you can build up a decoder. The first step was decoding this as an FM signal, because this is GMSK — it's basically just moving your frequency around a center frequency to convey bits. So you have your baseband coming in to a quadrature demodulator, which is your FM demod. You go into an FIR filter — I just made up some Gaussian pulse-shape taps there — and what that does is, you might have a really noisy incoming signal, and that smooths it out nicely so that you have symbols that you can then do clock recovery on. And then you do this very simple subtraction here. This single-pole IIR filter is really a long-term averaging filter. When you actually see the signal, it'll be moving between two values — hopefully negative one and one, but it might be elsewhere. The point is, remember how the incoming signal is scrambled, which means the bits are changing between one and zero; that means we have a lot of transitions, and if you average that over a very long period of time, it should average to zero. So what we can do is take that long-running average and subtract it from our original signal, and then we get something that's nicely sitting around the zero point — and you'll see why that's important in a second. So this is the demod part of it, very simple. Then we go and look at how that's used here. This is the next hierarchical block layer: you've got the demod here, and then you go into clock recovery. Here I've oversampled by eight, because it's quite cheap at this low rate. And then you get your soft bits out — remember, these are not hard ones and zeros; these are soft bits, because in that bottom diagram, when they were just above and below zero,
That stream goes into this next block, which is actually a Python block. I've become really lazy, so I don't make the GRC XML definitions anymore; I just want to use my block. So I write the code, expose it in Python, and then drop in one of these anycode blocks, where you can simply type in the Python that GRC would ordinarily inject for you. You do it manually, set the input and output, and that makes the block. What this block does is look for that magic sequence that marks the beginning of a payload, and then deinterleave, like I showed you with that interleaver diagram. It builds up the memory, because we know how long the payload is, and once the interleaver is full, it spits all the bits out to the next stage. So nothing comes out of this port until it's seen the synchronization sequence and filled up the interleaver; then it pumps everything out and resets its state internally. What's nice is that it's just Python code inside, no C++. It actually turns out to be the most expensive block in the entire flow graph, so if you want to run it for real you should probably implement it in C++, but for prototyping offline it was quite easy. Then, as you saw before, you go into this FEC decoder, which is what implements the convolutional decoding. It's passed an entire payload's worth, and it decodes that payload: soft bits come in, hard bits go out. It uses those soft bits when traversing the trellis to find the most likely path, and once the path is decided, it outputs the hard bits that make up that traversal. So once you come out of this, you've got error-corrected bits, hopefully, from your original received symbol stream.
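The search-and-deinterleave logic is simple enough to sketch outside GNU Radio. The sync pattern and the interleaver dimensions below are made up for illustration, not the real AERO parameters:

```python
def find_sync(bits, sync):
    # Return the index just past the first occurrence of the sync
    # pattern, or None if it hasn't been seen yet.
    for i in range(len(bits) - len(sync) + 1):
        if bits[i:i + len(sync)] == sync:
            return i + len(sync)
    return None

def deinterleave(payload, rows, cols):
    # Undo a block interleaver that was written row-wise and read out
    # column-wise: received element c*rows + r is original r*cols + c.
    assert len(payload) == rows * cols
    return [payload[c * rows + r] for r in range(rows) for c in range(cols)]
```

Interleaving and then deinterleaving with matching dimensions is a round trip, which is an easy property to check offline before wiring the logic into a flow graph.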
Our length at this point is 576 bits, so we go into Stream to Tagged Stream, which takes the first 576 bits and puts a length tag on them to say how long that packet in the stream is, and then turns it into a PDU, which is like a message. So far we've been sending samples and bytes through the flow graph in a continuous streaming manner; at this point we turn our payload bits into a packet that we can asynchronously send through the rest of the system. That message goes to this block here, the descrambling block; again, just a Python block. You saw before that you basically need to generate a scrambling stream, and this block does that too, but it also internally checks the CRC. Once it descrambles, it makes sure there's a valid CRC and adds metadata to the message saying whether it's valid or not. If I were to use this elsewhere, I'd need to implement another block to do the CRC checking separately, but at the moment that's all done in there. Out of this you get a message containing your individual signalling units (remember we had six): it descrambles them, CRC-checks them, breaks them up into individual packets, and outputs them on this pad sink to whatever is using this particular flow graph. At that point we're good to go. So this is the GUI, and it's a little bit simpler than before. We have this file which we recorded, and we put that into this Frequency Translating FIR Filter, which is used to select the channel we want to listen to. Then again we do that tricky squaring stuff, just to visualize it. There are a couple of different ways of doing this, but in this case we go into the decoder here, and I've also got a demod here. The reason I have the demod is that I'm not using its output; I'm just looking at the average value.
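Before moving on, the descramble-then-CRC idea from that stage can be sketched generically. The LFSR taps, seed, register width, and CRC polynomial below are illustrative stand-ins (CRC-16-CCITT), not the actual AERO parameters:

```python
def lfsr_bits(nbits, taps=(0, 14), seed=0x7FFF, width=15):
    # Generic LFSR keystream for an additive (XOR) scrambler.
    state, out = seed, []
    for _ in range(nbits):
        out.append(state & 1)
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1
        state = (state >> 1) | (fb << (width - 1))
    return out

def descramble(bits):
    # XOR with the keystream; running it twice returns the input,
    # which is the defining property of an additive scrambler.
    return [b ^ k for b, k in zip(bits, lfsr_bits(len(bits)))]

def crc16_ccitt(bits):
    # Bitwise CRC-16-CCITT (poly 0x1021, init 0xFFFF), MSB-first,
    # standing in for whatever check the real signalling units carry.
    reg = 0xFFFF
    for b in bits:
        msb = (reg >> 15) & 1
        reg = (reg << 1) & 0xFFFF
        if msb ^ b:
            reg ^= 0x1021
    return reg
```

The scrambler being its own inverse is what lets one block serve for both scrambling and descrambling, as mentioned above.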
Remember how the demod subtracted the average? We could just run the decoder by itself, but because I want to visualize what the frequency offset is, and maybe plot it, I'm bringing this demod in and just using its average output. In this case it feeds back to the rotator, which is doing the frequency adjustment; just another way to do it. So for the output, remember we emit messages, and here we've got these two blocks: one is the message parser, which just tells you all the messages coming in, and the other is this user message parser, which specifically looks for those user messages, accumulates the data, and prints out the payloads. In addition, I go PDU back to a stream, unpack those bytes into bits, and send them out on a UDP sink, because there's another little flow graph you can run that will do the raster plot for you, so we can see what the structure of the data actually looks like. And in this corner is the audio output: you take your bits, resample them to an audio rate, do some filtering, use a VCO to generate one of two tones (moving between those tones and the separation between them), and go out to your audio sink. That's a nice little audible visualization; what's the term? Sonification.

So have a look at this. This is that recording again, and we've clicked on that particular frequency; it's printing out the information there. You'll notice this is where I clicked on it in the previous one, so it's slightly offset from DC, and here the green line shows you where the center of this particular signal is. Then we can use that rotator to rotate it back so it's exactly at DC again, good for slicing. So how does that work? Well, we're not using that FFT-peak technique anymore. Why is that so slow... hello... there we go. This is what's originally coming in; you can square it, and those peaks are still offset. Here we've moved it back, because we're showing the output post-rotator, and that's nicely lined up around the center marker. It's off the screen, unfortunately, but there's a little graphical text box telling you the detected frequency offset. And there's a really easy check: we've got the baud right there, and it's exactly at 600 as we expect. So let's look at this: the green line is the DC average, nicely at zero, and then we have our symbol stream, the blue line, nicely shaped after the filtering, which we end up slicing on. You can see we've got these nice excursions and it all looks good. Let's look at the clock recovery: the dots, and you've got the nice separation there; all looks good. This is what comes out of the FM demodulator, and once we do that filtering, it turns into this, so you can see what an effect a nice matched filter can have on your signal and the performance of your decoder. Another nice view is the histogram: these are the soft bits coming out of the clock recovery, so it's a nice representation of that separation between negative one and one; like looking at this from the side. Now, one thing I did want to show you is that DC value, and that little number that shows the frequency offset. It's cool just to watch it, and maybe output it to a block and plot it later. There are probably other ways of doing this too, but I pulled this in a while back and it seemed to work quite nicely: I have a Python module called realtime_graph, and it's been quite handy because it just gives you a really simple Pythonic interface to draw stuff in matplotlib.
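What the histogram is showing, viewed from the side, is the two soft-bit clusters. Slicing them into hard bits, and measuring the cluster separation that the histogram makes visible, can be sketched like this (assuming zero-centred soft bits, which is exactly what the DC-removal step buys you):

```python
def slice_bits(soft):
    # Hard decision on zero-centred soft bits.
    return [1 if s > 0 else 0 for s in soft]

def eye_separation(soft):
    # Distance between the means of the two clusters; the bigger this
    # is relative to the noise, the healthier the histogram looks.
    ones = [s for s in soft if s > 0]
    zeros = [s for s in soft if s <= 0]
    return sum(ones) / len(ones) - sum(zeros) / len(zeros)
```

If the DC average were not subtracted first, both clusters would shift together and the fixed threshold at zero would start mislabelling bits.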
In this case you can just import it. I have this variable here, which is just an empty list: the points we're going to plot. Then we have this anycode block; anycode is a block where you can inject arbitrary Python into the Python that GRC generates. In this case I just create a realtime_graph with the title "fine tuning", say I want to show it, and that's it; I just call this graph. Then here I use another anycode with a variable dependency, so this code gets fired every time frequency_shift changes, and frequency_shift changes whenever that probe runs on the average value that's used to compute the DC offset. All it does is append the shift value to the fine-tune points list and set the data of the graph (I'm actually using a semicolon there to do two lines of Python in one), and that automatically fires every time we get a new point. What's cool is that when you run it, this pops up on my screen, so I'll just move it over. This is showing you, as the program runs, the receiver adjusting its fine tuning of the incoming signal and moving the baseband, and now it's actually locked onto the signal properly. That's about a 250-and-a-bit hertz offset from where we clicked to where it actually needs to be centered, and that's the offset showing in that graph. So how does that actually work? It's really quite simple; let's have a look over here. What I'm going to do, instead of plotting the DC coming out of the already-corrected version (go away), is plot it coming out of here, which is effectively the same as if I did not rotate at all. Actually, what I'm going to do is turn all that off: disable that, disable that, disable that, and then I
want that, and that. Okay. So we're going to turn this rotator off, effectively, by multiplying frequency_shift by zero, so we never rotate the incoming spectrum; it still does the averaging internally, but we get to see what the output looks like. Remember, before, we had that green line around zero and the blue moving between... is that the right output? Average, and filtered... let's do the demod. Okay, there we go. What's interesting is that this is the signal before the filtering step, so it looks really noisy, as you can see. But as it turns out, when you take the long-term running average of the filtered version of the signal, you see how the green is actually up here instead of at zero. That's because, although the quadrature demodulation has worked, and the signal is still within our bandwidth here (you can see it's still within our bandwidth), it's just off center. That effectively moves your one and zero, which would nominally be at negative 300 and positive 300; since we moved it, it adds an overall offset to your output signal, which is what the green line shows. Because we've moved it in frequency, we've now got this DC component that has moved with it after that long-term averaging. And what's nice is that we can probe this DC value and convert it back into the frequency offset, because it's linearly proportional to the frequency offset. That's why it goes to that probe, and then you do that little calculation to get from the DC offset back to hertz. So that's that: a super easy way to correct for frequency offset. I just thought I'd take you through it so you can visualize what's going on and see how to put it together. So now, we've got one, two, three... nine channels there, and we just picked one. What happens if you want to do them all at once?
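Numerically, the whole fine-tuning loop reduces to a couple of lines. A pure Python sketch (sample rate made up; GNU Radio's Quadrature Demod applies a gain, taken here as 1):

```python
import cmath

def dc_to_hz(dc_avg, fs, gain=1.0):
    # The discriminator output for a constant offset f is
    # gain * 2*pi*f/fs, so the long-term DC average maps
    # linearly back to hertz.
    return dc_avg * fs / (2 * cmath.pi * gain)

def measure_offset_hz(iq, fs):
    # Quadrature demod, then a plain mean standing in for the
    # single-pole IIR average in the flow graph.
    steps = [cmath.phase(b * a.conjugate()) for a, b in zip(iq, iq[1:])]
    return dc_to_hz(sum(steps) / len(steps), fs)
```

A tone sitting 250 Hz off centre comes back as roughly 250, which is the value you'd feed (negated) into the rotator to re-centre the signal.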
Again, certain members of the audience probably have far better ways to do it, like a polyphase channelizer; that's what all the cool kids are doing. But I wanted something that didn't require too much thought. Let me just show you this one still on the screen: did you see how it found it, and there were those bumps in it? Those are the sorts of frequency offsets you can sometimes encounter, larger than you might expect, and that's why you need this setup for fine frequency correction. Remember, our two symbol points are only 300 hertz either side of DC in this case, so if you have some massive frequency shift, even just temporarily, it's going to corrupt your data, because you'll get all ones or all zeros. So let's do them all at once. How? Well, a while back I created this multi-channel decoder idea, where you specify a sort of singleton instance of a decoder, give that to the multi-channel decoder, and it spins up that many copies, one for each channel you want to decode. This is a rather computationally expensive way to do it, and I might go back and try a polyphase channelizer approach, which is computationally far less expensive. That's right, Tom, right? That's the way to do it, but there are various assumptions and restrictions around it. Well, yeah, but actually being able to set programmatically how to combine them; I mean, that would be the Python I'd need to implement. Yeah, but that's the cool way to do it; have a look online. Tom Rondeau, by the way, is the maintainer of the GNU Radio project; he spent many years adding great documentation, fixing up the framework, and making it what it is today, so big props to Tom. But in this case, consider this approach as well. What I've done here is create this sort of template: our baseband comes in from the receiver, we translate it to the frequency where our channel is, and then we just plug it into our P-channel decoder as we had before, that single instance of it, and send the messages back out. That's all it is: move to your channel, then decode. (And aka TNT has also done some pretty amazing SATCOM-style hacking and some really advanced GNU Radio stuff, so definitely check out his work.) Once we've defined that template, this is all you need to decode all nine channels: I have a list containing the frequency offsets of the nine channels, you connect the USRP Source into the multi-channel decoder, tell it we want the InMarsat AERO P-channel decoder, give it the frequency list, and hook the output port into the user message parser. When you press play, this spins up nine versions of the decoder, hooks them up to each of those channels, and lets it run. So once we hit play... we get an error, naturally. Apparently something's still running. Oh, that's old... hmm, give me a second. What's that? No... no, I was here before... I'm seeing a traceback running this in one of these seven windows; we'll just try it here. What is this saying? Oh, that was kind of obvious: I haven't got a USRP. So here's something I prepared earlier, that same recording; I was off the screen, which is why I didn't have the visual cue. The reason I wanted the USRP is that it's what I wanted to show you with the Internet, but since we don't have it, we might come back to that later if I can set it up, but probably won't, so I won't get hacked.
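The template boils down to one frequency translation per channel. A sketch of the idea in plain Python (the real flow graph also low-pass filters and decimates each branch, and a polyphase channelizer shares that filtering work across all channels far more cheaply):

```python
import cmath

def channelize(iq, fs, offsets_hz):
    # One frequency translation per channel: multiply the baseband by
    # exp(-j*2*pi*f*n/fs) so the channel sitting at +f lands at DC,
    # ready to feed into its own copy of the single-channel decoder.
    streams = {}
    for f in offsets_hz:
        streams[f] = [s * cmath.exp(-2j * cmath.pi * f * n / fs)
                      for n, s in enumerate(iq)]
    return streams
```

A tone at +1000 Hz translated by a 1000 Hz channel offset collapses to a constant at DC, which is exactly the condition the per-channel decoder wants.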
My friend Ian Buckley set up the dish and everything. He's got an ODROID, a small embedded eight-core ARM (four big cores and four little cores), in an enclosure next to that dish, plugged into a USRP. That whole little thing sits there running all the time, running this, streaming messages, and I've had it logging to a file, storing all sorts of stuff just to see what turns up. What's kind of nice is that because of the low bit rate and so on, it can actually do all nine channels, even with this expensive non-polyphase approach, on the embedded platform. So if we run that... there we go. You can see it's created all the decoder instances with all the different options; each of those decoders is now finding the unique synchronization word, and they're not printing the generic messages, just accumulating user messages. And... oh, thanks: someone very kindly had a USRP brought up, which is great too. The only thing we're missing is a dish with a circularly polarized antenna feed, an LNA, and a bandpass filter, which is why I wanted to log into the box. We could make one; you've got, what, 40 minutes? Turn that chandelier into a dish. But we'd still need an LNA and a filter. Thank you very much, though; that moment has come and gone. This recording is really short, so we don't get a lot out of it, but my goal here was just to do a cleaner implementation. I'll put all of that up on my GitHub so people can play with it, and if anybody wants to look at using the P-channel and synchronizing to it, you can go up the layers, look at the rest of it, and maybe get to voice, but that's for another day. Any questions on this InMarsat stuff before I move on to the next topic?

Yeah: a lot of them usually carry the rego (registration) of the aircraft, so you can look at how often and where they pop up. You can also infer... I've seen messages... maybe I can connect to it at the end, or maybe I still have part of the log open in my console; here it is, it was running in screen. Anyway, you get messages that contain waypoints, so you can derive the planned flight paths, and when you have the rego and information about, say, the weather or airport state, you can determine where they're actually flying to. And there are all the weird human-readable messages you wouldn't expect to see that identify people, or interesting factoids about the aircraft, or whatever. So it's another tool in the plane spotter's arsenal, and it's cool because usually you get these from a terrestrial network; here you're getting them from a satellite that services a huge portion of the Earth, so you're potentially hearing any aircraft in that area. This is super simple compared to some of the other stuff. Sec came to Cyber Spectrum last night; there he is up the back with the cool radio badge. He's done some absolutely incredible work on the Iridium constellation, with Schneider and some other guys, so definitely check out their talk from the most recent HOPE conference. Really stunning stuff, way beyond this, but I just wanted to show you how easily you can get into SATCOM with this kind of technology. So what else have I got for you today? Oh, we had the raster sink too. This is that little tool I mentioned that you can use to visualize the data: it's just a UDP socket hooked up to the Time Raster Sink. That's all it is, really basic.
But it's a very powerful tool for looking at the bit patterns. So if I run this... that's running; do we have a signal? Yep. Once we start spitting out data packets, you can see that as it decodes an entire payload's worth, you get this data coming out, and this is showing you the bits post-descrambling. So this is looking for repeating patterns, versus the actual raw over-the-air data. You could equally hook up your UDP sink (where is it... UDP sink, right here; this one is looking at the output messages) to, say, the UChar To Float, or just take the floating-point output of the demod, and then we can see what that looks like before any of the decoding has happened. Let's see... where's my... did I close it? So again, that's looking at the raw demodulated symbols. And I've probably hooked it up to the wrong thing, because that looks like noise... ah, it's the demod, filtered. Oh, syms! I meant to hook it up to syms, I beg your pardon: the symbols coming out of the clock recovery. Let's try that, like this. Oh, what's wrong? "Port is not connected." Which port is not connected? Scope sink... what have I done? Ah, this is no longer connected; this was connected to this. Okay. Oh no, that's the other monitor. These should be the soft bits, which is why they're not just two distinct colors. I might not have set my width correctly; I thought it was 1200... maybe not... it's not 576... no, it was one five two? I can't remember; it's one of those, but with one of them you get exactly that plot I showed you in the presentation before. So again, you can see it making these subtle adjustments; another cool little trick to visualize stuff. All right, so that's InMarsat.
Let's see... there's some stuff I want to show you remotely, but we might leave that for another time. Okay. So I'll do a quick little fun tidbit, and then we'll look at some analog NTSC video decoding. One thing I wanted to have more done for today unfortunately turned out to be a lot harder than I thought, but there are some old signals that are still used and are really quite neat, and one of them is frequency-modulated continuous-wave (FMCW) radar; just trying to find where I put it... anyway, I'll show you a quick demo of what you can do with it. In a lot of radar systems, and I've spoken about this before, you have, like at the airports, the large spinning radar that sends out a very powerful pulse of energy and then waits for echoes to reflect back, say off the fuselage of aeroplanes flying through the sky and other objects that present a significant radar cross-section. They have that huge antenna because it's directional, and thus you get quite a lot of gain, so that you can receive the very weak signal as it bounces back. In that case they're basically sending out a very powerful, quick pulse of energy at a single frequency. But there's another radar system called FMCW where, instead of sending out one frequency, you linearly (or otherwise) increase the frequency from a low point to a high point and then jump back, so it ends up being a sawtooth, and you generate what's called a chirp. This chirp goes from a low frequency to a high frequency and just keeps repeating, and that has some very powerful properties you can exploit to create a very effective radar system; it's used in all sorts of environments.

The nice thing is, firstly, that when you're decoding and demodulating it, the chirp is unique, so you're effectively creating a matched filter for it, and other noise disappears. And secondly, when you take the FFT of the dechirped signal, the frequency shift you get is actually your range information for your targets: the dechirping turns the echo's time delay into a frequency offset. It's really pretty powerful, and you can do something like this. It's a little too big for the projector, but this is running, and although we're all about SDR, you don't even need an SDR to play with this. You can see there's some stuff happening there; I've muted the speakers on my laptop, but I'm going to unmute them. Can somebody give me a piece of paper? Does anybody have a piece of paper? Maybe I've got one... the laptop's all right... check, check... all right, let's get a good setting here; that should be pretty good. Unfortunately you can't see the bottom, but you'll get the effect. The range, you know, is not that fantastic, but you get the idea. So what's actually going on here? This is a flow graph that generates an FMCW chirp and plays it through the speakers on the laptop, while simultaneously listening on the microphone, and it's doing a transformation (effectively an FFT) that looks for that pulse and plots it out. The reason is that as I change the distance between the laptop speakers and the microphone, you're changing the time delay for that chirp to bounce off another surface and come back, and through that dechirping, the FFT bin it falls into is directly related to the range information, the propagation of that chirp from one point to another. And, you know, with these signals the microphone is quite noisy, so you get this kind of stuff.
But in other systems you can process and filter that out pretty effectively. One other thing: this doesn't work very well, because again it's just laptop speakers and a microphone, but there are two speakers in this laptop, which means we can create a phased array and try to get angle information out of the target space above the laptop. So, if you listen, hopefully you'll be able to hear it; it sounds pretty uniform right now. What I'm going to do is click this button here, and it's going to phase from the left channel through to the right channel; it's basically going to sweep. What that does is change the delay between the two wavefronts that come out of the laptop. When they come out at the same time, the beam goes directly up, but when you offset one to come out a little sooner than the other, the wavefronts add together as they propagate and beamform in one particular direction, and that sweeps through. So you change that delay, and you effectively get a wavefront that propagates, basically, like this. Again, it doesn't work very well, because your microphone is inside your laptop right next to the speaker, but have a listen; you might be able to hear the stereo effect. You hear that? So what it's doing is processing the FFTs, and I've fixed the rate at which it does this beamforming; every time it runs one transform, that builds up one scan line, and for each of the beamformed angles it produces a stacked scan line, so one of these pictures is supposed to be a full run through the beamforming range. The idea is that if you were to hold it here (you've got to get it right; it's very finicky), you would see... on the horizontal axis, the x-axis, this is still range information, but your vertical axis is now angle information, so you'd expect a hot spot at the angle at which your target object is actually sitting. And maybe the transfer function isn't quite right, but... oh, there we go, see? Is that a fluke? Okay, so that's actually kind of working, but I have to stop talking. You can see there's that line that comes down (I'll describe it: there's a line that comes down) and it gets a little weaker at the edges, but it's a little stronger just offset from center; see, it's slightly stronger offset from center, and that's because I'm holding it left of center. Maybe... I don't know how well this is going to work... hold it here... yeah, a little; that one's quite close on the left-hand side, so it was around about here, I think. Anyway, you get the idea. Again, that's super simple, and what's nice is that you can get a USRP like a B210 with two coherent outputs, get some cans, point them somewhere, and have this do exactly the same thing in the RF domain. You can do normal CW and just look at Doppler shifts, but if you use FMCW, then you can do this phasing and also get range information out of it quite effectively. I started doing it very quickly at the end of one Cyber Spectrum with some cans, but I think I was getting overruns and stuff; maybe for another conference. So how does that actually work? It's pretty straightforward. There's no Internet connection... let's see... I know what I can show you. Oh, it's not open. What's the document called? Here we go. As you do research and read things, you basically find out that everything cool that has ever been done in computer science or DSP or anything was all done in the '60s and '70s.
I found this paper (I'll show you in a minute why it's interesting) that talks about FMCW radar; it's by Barrick, from 1973. What I want you to imagine, and I'll show you what this actually looks like in baudline in a second, is that this solid black line is what's transmitted by your radar system. You've got time running along the horizontal axis and frequency on the vertical, so you're effectively making a tone that goes from a low frequency to a higher frequency. And this dotted version is what you're receiving: if you're transmitting this waveform and it's bouncing off an object and coming back to you, you'll essentially see the same waveform, but delayed a little in time. That's really nice, because this waveform, since you can create a matched filter for it, has really nice correlation properties, so you can ignore noise. Now, what's pretty cool about this chirp waveform is that on the receiver side you can create the same chirp: basically take the chirp that you transmitted and multiply it by your incoming samples from your USRP, and if you receive this signal, or a delayed version of it, you'll effectively undo its chirp nature and get a single tone. You take the FFT of that, and in the frequency domain you get a bin that lights up, and that's your range information; I'll show you what that looks like in just a second. That's the kind of waveform we're dealing with, and we can realize it pretty easily. You've got the Audio Source, which represents the microphone, and then you go through a Hilbert transform to turn it from floating point into the complex domain, and the reason we do that is because we're going to multiply it by the chirp that we generate.
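The multiply-then-FFT idea can be simulated end to end in a few lines. All the numbers here (sample rate, sweep width, delay) are arbitrary, and the DFT is written out longhand rather than using a library FFT:

```python
import cmath

def chirp(n, fs, f0, f1):
    # Complex sawtooth chirp: instantaneous frequency ramps
    # linearly from f0 to f1 over the block of n samples.
    out, phase = [], 0.0
    for m in range(n):
        out.append(cmath.exp(1j * phase))
        phase += 2 * cmath.pi * (f0 + (f1 - f0) * m / n) / fs
    return out

def range_bin(tx, delay):
    # Delay the echo, mix it against the reference chirp (conjugating
    # the echo so the beat tone lands in the positive half of the
    # spectrum), then find the strongest DFT bin; its index is
    # proportional to the target range.
    rx = [0j] * delay + tx[:len(tx) - delay]
    mixed = [t * r.conjugate() for t, r in zip(tx, rx)]
    n = len(mixed)
    mags = [abs(sum(mixed[m] * cmath.exp(-2j * cmath.pi * k * m / n)
                    for m in range(n))) for k in range(n // 2)]
    return max(range(len(mags)), key=mags.__getitem__)
```

With a 0-to-64 Hz sweep over 256 samples at a sample rate of 256, a 40-sample delay beats at bin sweep * delay / fs = 64 * 40 / 256 = 10, so the peak bin reads off the delay directly.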
generate um in this block here but what we what we do is we have this signal source which is the sawtooth waveform which is exactly the waveform that i just showed you here sawtooth and we have this particular frequency here i'm running it at 50 so that noise that you heard before that was actually a chirp running at 50 hertz so you hear 50 chirps per second and then that's generating a sawtooth waveform which is then going into this vco voltage control oscillator and so this is actually producing the chirp signal so what you can think about is the sawtooth is controlling the frequency of the waveform that we want to generate and the vco is actually generating that waveform at the frequency so this is constantly doing this sawtooth and so our frequency is constantly going from low to high and then low to high and then low to high so what we're doing is once we've generated the chirp we mix that with our incoming signal and then it goes off down there uh let's see so that goes down down here so we've got our sawtooth and we're also going into this vco and oh this is the phased array component so i'll come back to that in a second um so what we we've done is once we've actually multiplied our incoming from signal from the microphone by this chirp then we go and we put it into this multiply block and i've just done a bit of trickery here i can't even remember where this is this is a restriction anymore because when i do it in python i don't don't worry about it but i'm trying to do an f of t they're restricted to powers of two here right you can't do arbitrary size f of t's all right so that i've over complicated things but what's new um so i was trying to make a power of two f of t so what i'm actually doing here is i'm padding out the samples turns out that you don't actually need to do that i mean you don't need to do that when i've been using numpy to do some offline stuff and you don't don't need to do it so here i basically have a window usually the window is 
Usually the window is applied within the FFT block, but here I've windowed the samples I've received, which are shorter than 1024, and I'm padding the rest out with zeros, so I'm effectively bringing it up to a power of two. The Stream Mux combines the windowed samples I've brought in with the padding on the end, and then it goes into the FFT. So, you know, that works too. I run the FFT and then chop off half of it, because we're only interested in one side and in the magnitude information. So again, I take the magnitude, take the log of it, and then decimate the stream, keeping one in every 512 transforms, because this modified sink can't accept too much data, otherwise the GUI hangs; it's just a weird OS X thing. And that's how we get the visualization of the range information. That's essentially what you're doing: take the incoming signal, multiply it by the chirp you originally sent out, take the FFT, and you get range information. Then what about the rest of it, the phasing? I've got that checkbox there, and the checkbox is this run_array variable; it's either a one or a zero. Here I've got a Signal Source, and it's a sine wave, but it's set at a frequency of 0.6 Hz. Don't think about it as a sine wave; think about it on the complex unit circle. What you can imagine then is a phasor that's slowly rotating around, and it's doing that rather slowly because I've set it at this really low frequency. Usually, when we're not running the array, we just take the output of this VCO, which is our chirp, but if we want to phase it, we can do that too. When we're not running the phasing, we want the same signal to come out of the left and right channels of our speakers. So let's go and look at the Audio Sink here.
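The window-then-pad arrangement can be reproduced in NumPy with a few lines (block sizes are assumed; as noted above, NumPy's FFT doesn't actually require a power of two, so the padding only mimics what the flow graph does):

```python
import numpy as np

n_samples = 960        # samples arriving per block (hypothetical)
fft_size = 1024        # padded up to the next power of two

x = np.random.randn(n_samples)            # stand-in for the dechirped samples
windowed = x * np.hanning(n_samples)      # window applied before the transform
padded = np.concatenate([windowed, np.zeros(fft_size - n_samples)])

spectrum = np.fft.fft(padded)
half = spectrum[: fft_size // 2]                  # keep one side only
log_mag = 20 * np.log10(np.abs(half) + 1e-12)     # magnitude in dB
```

Zero-padding doesn't add information; it just interpolates the spectrum onto a power-of-two grid so a fixed-size FFT block can consume it.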
Oh, for some reason I have this over here: a Multiply Const block, basically just getting the signal to a nice level for the Audio Sink. Here we've got our left and right channels, and once we've created our chirp, we take the real component and put that into the left channel. Then, for the array, here's what I forgot: in that sine-wave Signal Source, the amplitude is actually set to run_array. If run_array is on, we get samples coming out of it with a whole range of values, but if the array is not running, the amplitude is zero, so it produces samples that are all zero, as if it were turned off. So if the array is off, this produces zeros; we add one, so we actually get ones coming out; then we multiply that by our chirp signal, so we get the chirp with the same phase offset, take the real component, and send it to the right speaker, and the same signal goes out left and right. If we click that checkbox, suddenly we get useful samples coming out of here, but the Add block is actually adding (1 − run_array), which swaps what's going on: now the Signal Source produces valid, useful samples, the constant is zero, and valid samples plus zero is still valid samples. Then you multiply that by the chirp signal, and although this is a slow sine wave, it has the effect of slowly changing the phase of our chirp. Once you take the real component and put it out of the right speaker, we've got two chirp signals offset in phase coming out of the left and right speakers: one has a constant phase, and the other is retarded in phase and then advanced in phase. That's why you get that weird stereo effect, where you can hear it move from side to side of your head. And that's all there is to it, really; pretty basic.
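Here's one way to read the run_array trick in NumPy. This is my interpretation of the description above, with invented parameters: the rotating phasor's amplitude is gated by the checkbox, a constant (1 − run_array) is added, and the result multiplies the chirp, so the right channel either matches the left exactly or slowly rotates in phase.

```python
import numpy as np

fs = 48000
t = np.arange(fs) / fs
# A complex chirp standing in for the VCO output (assumed sweep)
chirp = np.exp(1j * 2 * np.pi * (1000 * t + 2000 * t**2))

def stereo_channels(run_array: int):
    # 0.6 Hz phasor on the unit circle, amplitude gated by the checkbox
    phasor = run_array * np.exp(1j * 2 * np.pi * 0.6 * t)
    rotator = phasor + (1 - run_array)  # off -> all ones, on -> slow rotation
    left = np.real(chirp)               # constant-phase reference channel
    right = np.real(chirp * rotator)    # phase retarded/advanced over time
    return left, right

left_off, right_off = stereo_channels(0)
# With the array off, both speakers get the identical chirp
```

The add-a-constant construction means no blocks need to be rewired when the checkbox flips: multiplying by all-ones is a no-op, and multiplying by the rotating phasor applies the slow phase sweep.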
No complicated processing beyond doing this chirp and an FFT, so it's a pretty cool waveform. Now, I'm trying to find some photos here; where did they go? This radar, in this case, is used to do a bit of fun stuff, but in other situations, if you actually look at the HF band... let me show you what that looks like. I'm going to open baudline here and get some signals; let's see, something not huge. Oh jeez. OK, that's floating point; let's see what happens. Cool. I'm just going to resize this so I can access the whole window. So this is baudline; baudline is a really cool piece of software, made by a guy called Erik Olson, that you can use to do offline analysis of recorded files. This is part of the HF spectrum that I've recorded; let me try to get this all on the screen. I just went out with a long wire and attempted to receive things over HF; let me see if I can find a photo of that. No, not there. Anyway, what we're looking at here is time on the vertical and frequency on the horizontal, and this was at, what frequency have we got here, 13 MHz, I think. I'm using an upconverter, so you can take a long wire, listen on HF, and then plug it into your normal SDR. Who's got a Ham It Up upconverter? You can get a Ham It Up or other sorts of upconverters; it's essentially a mixer that lets your normal SDR receive these HF signals. You've got AM radio stations, hams talking, PSK31, all sorts of other interesting analog and digital modes. But then you've also got these chirps that show up, and as it turns out there's a company called CODAR that makes an HF radar ocean-mapping system; you can see their characteristic antennas around the place. They set them up near the shorelines to map the ocean currents: they put up an antenna and constantly transmit this chirp, and the system is specifically tuned so that as the wavefront travels out over the ocean it reflects off the waves, and those waves reflect a small amount of energy back. Also, as a wave moves either away from or toward the transmitter, it imparts a Doppler shift, which you can pick up after integrating on this chirp for some time. You can do all sorts of interesting experiments with this; the guy who writes about radios and space stuff on his blog has done some really cool processing of these sorts of signals. Once you start doing this sort of thing, it's really important that your receiver is locked to the transmitter, but because you have no control over the transmitter, you have to hope that, if you're lucky, the transmitter is locked to GPS. If you can get an SDR that accepts an external reference input, you can get a GPSDO (GPS-disciplined oscillator) that outputs 10 MHz disciplined by the atomic clocks on the satellites, and that brings your sampling rate into alignment with the transmitter, which is what you need to process the information very accurately and also extract the Doppler information. This is what the waveform actually looks like, and you can hear these things; we're kind of running out of time, so I don't know whether we can listen to them. What it looks like on the spectrum, as you saw before, is this sawtooth wave; in this case it's a down chirp, the frequency is going down. You can see these lines here, and if you zoom in there, and we can also zoom in in terms of frequency, there are the cool signals; there's our chirp.
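The size of the Doppler shift a moving wave imparts is worth putting in numbers: for backscatter geometry the two-way shift is 2v/λ. The 13 MHz carrier below is just the band observed in the recording, not necessarily CODAR's actual channel plan.

```python
C = 299_792_458.0  # speed of light, m/s

def doppler_shift(radar_freq_hz: float, radial_velocity_ms: float) -> float:
    """Two-way Doppler shift for a reflector moving at radial_velocity_ms
    (positive = toward the radar)."""
    wavelength = C / radar_freq_hz
    return 2.0 * radial_velocity_ms / wavelength

# A wave crest advancing at 1 m/s shifts a 13 MHz carrier by well under
# a tenth of a hertz -- hence the long integration time and the GPSDO
shift_hz = doppler_shift(13e6, 1.0)
print(round(shift_hz, 3))
```

Resolving shifts this small is exactly why the receiver's sample clock has to be disciplined to the same reference as the transmitter: an uncorrected fraction of a hertz of oscillator offset would swamp the ocean-wave Doppler entirely.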
What's interesting is that you've got what looks like a main lobe, the central signal, and then all these side lobes. The reason, and Peter, who does that space blog, actually had this on his blog, really interesting, is that this is AM'ing of the signal, due to the fact that the transmitted signal is gated. They're not transmitting continuously: they transmit a little burst, then stop, switch the transmitter to receive mode, listen, and switch back to transmit, which is what a primary surveillance radar at an airport does, for example. That makes sense, because they're using the same antenna to transmit and receive, and that's what produces this AM'ing effect in the frequency domain. I won't do it now, because it can be a bit of a pain, but if you were to zoom in on that signal even more and adjust your windowing function, you can see what each one of these chirps actually looks like. Can you see that it's actually a dotted line now? That's the gating of the signal, turning on and off, and when you put that in the frequency domain you get that AM'ing effect. But that's OK, because we're talking about, well, not quite, but almost the speed of light: if you take that receive time and multiply it by the speed of light, and that distance is within the virtual range you want to monitor, like out to the actual ocean waves, then that's all you need, and that's fine. What's interesting is that these signals are out there, and you can take them, process them, and produce these really cool plots, which I'll show you; I wanted to have more done, but didn't quite get there. But in terms of the actual setup, here are some photos.
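The gating-produces-side-lobes effect is easy to reproduce: switching a tone on and off multiplies it by a square wave, which scatters spectral lines either side of the carrier at multiples of the switching rate. All numbers here are invented for the demo.

```python
import numpy as np

fs = 8000          # sample rate; one second of signal -> 1 Hz per FFT bin
f_tone = 1000.0    # stand-in for the chirp's instantaneous frequency
f_gate = 100.0     # transmit/receive switching rate
t = np.arange(fs) / fs

tone = np.exp(1j * 2 * np.pi * f_tone * t)
# 50% duty square wave: on for half a gate period, off for the other half
gate = (np.floor(t * f_gate * 2) % 2 == 0).astype(float)

spectrum = np.abs(np.fft.fft(tone * gate))
strongest = sorted(int(b) for b in np.argsort(spectrum)[::-1][:3])
print(strongest)   # carrier at 1000 Hz plus side lobes at 900 and 1100 Hz
```

The three strongest lines are the carrier and the first pair of AM side lobes, spaced by exactly the gating rate, which is the structure visible around each CODAR chirp in the spectrogram.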
We actually went up to Skyline Boulevard, if you're familiar with the area in San Francisco; there's that road that goes up, and it's quite close to the ocean. You need to get away from civilization, because everybody's switch-mode power supplies absolutely destroy the HF spectrum. As it turned out, we had to do this entire operation purely off battery power, no inverters. I had some DC-DC converters to power the USRP, and when we turned them on, the spectrum just went to junk. We ended up getting these lantern batteries and ganging them in parallel to power the USRP; the laptops just ran as long as they could on internal battery power. Unfortunately, because they're MacBooks, you can't just pop in another battery, not that you could exchange the battery while you were recording anyway, but I got about an hour's worth of recording, which is pretty good. We had two setups here: a USRP N210 with a BasicRX board plugged in and a band-pass filter, receiving straight in, and over here a USRP B210 hooked up to a Ham It Up. So we had two antennas there, and we were really cold, and the mist was coming in: a long wire antenna as well as a dipole that I brought. We've got an LNA here and some other things, but everything's powered off batteries; the Ham It Up is off USB, and there's also a nine-volt battery powering the LNA. Really basic. Then we also got a 110-volt inverter to power off the car battery, to attempt to charge the laptops with, and as soon as we plugged that in... well, you have to make sure: you can't verify what's going on just by plugging it in; you actually have to put load on those switch-mode supplies. You plug it in and nothing happens; then you plug in the laptop to charge, and everything disappears into the noise. I don't have the photos on here, but I have these two squid fishing poles; they're telescopic, you put them up, hang a long wire off there, hook it up, and that's pretty much all you need. So what can you do? Well, I've written some Python scripts that you can use to transform this data. I won't demonstrate them, but I'll show you the progression. Another guy who is pretty much the expert in this field is Juha; I don't know how to pronounce his last name. He's done some absolutely amazing experiments. The reason this is cool is that they're generating a signal to map the ocean, but it's also very powerful, and because it's HF, it can also go up and bounce off the ionosphere. So you can use this, especially at night, to look at ionospheric effects on the signal, because the ionosphere effectively becomes a mirror to these HF signals that somebody else is producing, and you're using essentially passive radar techniques to visualize the signal and the effect that the natural world and the sun are having on it. So this is a plot; I've got some Python code I'm going to put up. It's essentially an FFT; I wrote it because I wanted to create really, really big FFTs and not have to muck around with a tool, just have it spill out a massive image. This is part of that spectrum again, and you can see there are these various CODAR signals at different frequencies that you can pick up. Once you do some clever processing, you can find the bandwidth, and then you can use some simple DSP tricks, doing autocorrelation on the signal to find the chirp the radar is actually running, because it's a repeating signal.
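The autocorrelation trick for finding the chirp repetition rate can be sketched like this (toy parameters; with a real capture you'd first isolate the signal's bandwidth, as described above). Because the radar repeats the same chirp over and over, the autocorrelation peaks at multiples of the chirp period, so the lag of the first strong peak recovers the repetition interval.

```python
import numpy as np

fs = 8000
period = 400    # chirp length in samples -- the value we want to recover
n_reps = 10

t = np.arange(period) / fs
# One complex chirp sweeping 2 kHz over its duration, repeated back to back
one_chirp = np.exp(1j * np.pi * (2000.0 / (period / fs)) * t**2)
signal = np.tile(one_chirp, n_reps)

# Linear autocorrelation; keep the non-negative lags only
acf = np.abs(np.correlate(signal, signal, mode="full"))[len(signal) - 1:]
# Skip the zero-lag peak and search a plausible range of lags
lag = int(np.argmax(acf[50 : len(signal) // 2])) + 50
print(lag)
```

Once the lag is known, you can cut the recording into chirp-length blocks and synthesize a matching reference chirp for dechirping, which is the next step in the processing chain.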
Then you can create a matching chirp, filter that down, and get a plot like this. Remember how we had the AM'ing effect and those side lobes? You have that frequency chirping down, going from high frequency to low frequency; if you match that with your own chirp and mix it down, you get a straight line. So that main lobe just becomes this straight line here, and then we have the other side lobes that come off there. The way this is running is that there's an FFT running from left to right, so this is time here, and each FFT is the length, in samples, of one chirp; that's how you can produce these nice plots. What we're interested in is what's going on in this band right here, because if you consider the speed of light and the distance it would traverse over the length of a chirp, we're talking multiple times around the world and then some. We're only interested in stuff that's close, effectively up to the ionosphere, which can be a virtual range of a couple of thousand kilometers, so that's just this very small band, and if you zoom in there it ends up looking a little bit like this. I'll show you another plot in a minute. Oh, are we out of time? Five minutes is good. So we're effectively looking now at whatever propagation and wavefronts are coming back, and we're receiving them all here. Let me try to find another plot; the cool thing is that you can actually see stuff that's moving. Maybe not in that one; let me find another one. Ah, here it is; have a look at this, which is a good one. Again, we've got our side bands there (you can't see them due to aliasing), the AM'ing, and if you zoom into one of them, where was it, this thing: this should be straight, but in this case I wasn't using a GPSDO.
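The "very small band" point can be made concrete. After dechirping, a reflection delayed by τ appears as a beat tone at f_b = (B/T)·τ, so virtual range maps linearly onto frequency. The chirp parameters below are illustrative, not CODAR's actual ones.

```python
C = 299_792_458.0  # speed of light, m/s

T = 1.0     # chirp repetition period in seconds (one chirp per FFT)
B = 50e3    # swept bandwidth in Hz (hypothetical)

def beat_freq_for_range(virtual_range_m: float) -> float:
    """One-way virtual range -> beat frequency after dechirping."""
    tau = virtual_range_m / C          # propagation delay
    return (B / T) * tau

# Even 2000 km of virtual range (out to the ionosphere) occupies only a
# few hundred hertz of the dechirped spectrum
print(round(beat_freq_for_range(2_000_000.0), 1))
```

With these numbers, everything out to the ionosphere lands in roughly the bottom 333 Hz of a 50 kHz-wide dechirped spectrum, which is why only a thin slice of each FFT carries the interesting returns.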
I have to go back and reprocess everything and do some fine frequency-offset correction. I think I calculated that at each step I was losing about 0.02 Hz, and even that tiny frequency offset ends up producing this visible drift over time, which is why you need to lock everything, integrate, and make sure you line everything up. But you can see stuff moving there in the middle. There's a better plot... oh, that's the one. Here you get this really interesting fringing effect, and you can see this discontinuity here, because I had an overrun and all the timing got desynchronized. But you can see that stuff is actually moving: this is a range plot, and these intensities are moving, so there's something out there that's actually changing its virtual range with respect to my receiver and the transmitter. Things might be getting closer or moving away, and you get these sorts of effects. I haven't looked into it enough; I don't actually know what's going on here. I have a feeling this is probably more likely the ionosphere than the ocean, but it might be a combination of both, or something else completely, so if anybody has any ideas, please talk to me afterward. This is a continuing area of research; when we went out, we did use GPSDOs, and I still have to process all that data and optimize my code a little, because it's many, many gigs of samples to convert. Anyway, this was just an experiment, again, to give you a little feel for what you can do. And in the last minute, I didn't get time to show you, but you can also do analog video decoding. FPV on drones actually ends up using NTSC over FM, at least the ones I've seen anyway, and you can write a flow graph for it. I was mucking around with this a little; it's still very experimental. Let's see: my vertical sync needs some improvement there, but effectively this is an NTSC waveform coming in, and I've got two matched filters that look for the vertical sync and the horizontal sync; it then tries to line everything up and give you a picture. I didn't actually see a color burst in this one, so I didn't have any color information; I'm just plotting luminance. Usually I find I just have to tweak the timing on some stuff, and there are some weird blocks I had to add, like certain things for peak detection, but it's kind of getting there. If you take out all the special synchronization stuff, you get a solid image, but then you get a roll in both vertical and horizontal, because the timing is not quite right. Anyway, that's ongoing work too; I think someone was going to bring an FPV thing, but anyway, it's an old-school signal. So I'll wrap it up there. Thank you very much for your attention, especially for that amount of time.
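The matched-filter sync hunt can be sketched in NumPy. Everything here is a toy: an assumed 8 MS/s sample rate, idealized levels, and a synthetic video signal with a clean sync tip at the start of each ~63.5 µs line. Correlating against a sync-shaped template and peak-picking recovers the line starts, which is the idea behind the horizontal-sync matched filter described above.

```python
import numpy as np

line_len = 508   # samples per NTSC line at ~8 MS/s (63.5 us, assumed rate)
sync_len = 37    # samples in the ~4.7 us horizontal sync tip

# Three synthetic scan lines: sync tip well below the picture level
line = np.full(line_len, 50.0)
line[:sync_len] = -40.0
video = np.tile(line, 3)

# Matched filter: an edge-shaped template spanning the sync tip and the
# return to picture level, correlated against the mean-removed video
template = np.concatenate([np.full(sync_len, -40.0), np.full(sync_len, 50.0)])
corr = np.correlate(video - video.mean(), template - template.mean(), "valid")

line_starts = sorted(int(i) for i in np.argsort(corr)[::-1][:3])
print(line_starts)  # one peak per line, line_len samples apart
```

A real decoder would do the same hunt with the actual sync pulse shapes (and a separate, longer template for vertical sync), then resample each line to a fixed width to stop the picture from rolling.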