Okay, we are about to start the topic about the TRX toolkit and the virtual Um interface. So what is the TRX interface? It was introduced quite a long time ago by the OpenBTS project, and it's a custom interface between layer one and the transceiver. I call it layer one because nowadays it's not only the base station: OsmocomBB may also act as a layer one. Basically, three UDP sockets are used. I was asked not to call one of them simply "control", because that name is already taken, so it's the TRX control protocol. It's text-based and looks very similar to the HTTP protocol: you just send a request, for example "please tune to this particular frequency", and you get a response with a response code, like in HTTP, where zero means OK. There is also a clock protocol, which allows you to get clock indications from the hardware. And the last interface is data, where you send and receive uplink and downlink bursts.

What is the TRX toolkit? It's a set of tools I wrote in Python. I have been doing some development on OsmocomBB and trxcon (I will talk about this later), and I had to test it: at that time gr-gsm had no transmit capabilities, and I had to somehow test what I wrote. This is why I wrote it, and I found out that it could also be very useful for other things, for example to help with debugging of the TRX interface in general.

The first thing I wrote was FakeTRX. It basically emulates a transceiver and allows you to connect OsmoBTS and OsmocomBB together without actual radio hardware, so it's basically a virtual Um interface, because you're sending and receiving bursts there. There are also some supplementary tools, for example for command injection: you can capture bursts on some interface, for example between the base station and the transceiver, you can inject them, and you can generate random bursts
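The three interfaces described above are just UDP sockets carrying either short text lines (control) or small binary per-burst messages (data). As a rough, non-authoritative sketch of both, assuming the OpenBTS-style framing from memory (the exact command set and the burst header offsets are assumptions, not a reference):

```python
import struct

# --- Control (TRXC): line-based text with HTTP-like response codes ---

def build_cmd(verb, *args):
    """Encode a control command, e.g. b'CMD RXTUNE 935000'."""
    return " ".join(["CMD", verb, *map(str, args)]).encode()

def parse_rsp(raw):
    """Decode a response like b'RSP RXTUNE 0 935000'; code 0 means OK."""
    fields = raw.decode().split()
    return fields[1], int(fields[2]), fields[3:]

# --- Data: one small binary message per burst ---
# Assumed layout: Tx = TS(1) + FN(4) + power(1) + 148 bits,
#                 Rx = TS(1) + FN(4) + RSSI(1) + TOA(2) + 148 (soft) bits.

def tx_to_rx(msg, rssi=-60, toa256=0):
    """What a FakeTRX-style emulator does: rewrite the Tx header of one
    side into the Rx header the other side expects, keep the 148 bits."""
    ts = msg[0]
    fn = struct.unpack(">I", msg[1:5])[0]
    bits = msg[6:6 + 148]                   # skip the power byte
    return struct.pack(">BIbh", ts, fn, rssi, toa256) + bits

verb, code, _ = parse_rsp(b"RSP RXTUNE 0 935000")
print(verb, code)                           # RXTUNE 0

tx = bytes([2]) + struct.pack(">I", 1234) + bytes([5]) + bytes(148)
print(len(tx_to_rx(tx)))                    # 156: 8-byte Rx header + 148 bits
```

In a real setup these messages would go over the UDP sockets between layer one and the transceiver; here the framing is shown as pure functions only.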
just for testing.

The good news is that this part is more or less documented and is now merged to the master branch in Osmocom, thanks to Harald for helping with this. How can one use the TRX toolkit? As far as I know, it's now used by some testing setups in the Osmocom project, and in general it allows you to learn and research GSM without hardware.

Yeah, I will try to repeat. At the bottom of this picture you can see the classical way of communication between GSM layer 2/3 applications, like mobile or ccch_scan, and the firmware, where osmocon acts as a proxy between the serial port and the layer 1 control protocol, which is carried over a UNIX socket. You can also connect TTCN-3 test cases to both osmocon and trxcon. trxcon is also an application I wrote; it implements some parts of layer 1, for example scheduling and burst coding and decoding.

Then we get to the most interesting part: here we have the FakeTRX application, which emulates two TRX interfaces, one for OsmoBTS and another for trxcon. For OsmoBTS you don't need any modifications, because FakeTRX behaves like a regular transceiver; OsmoBTS doesn't even know who it is talking to. So you can connect them, and you can inject some commands using a simple control-command tool; you can, for example, generate random bursts and inject them, for example a RACH burst. Also, one interesting feature is TRX sniff: it allows you to capture bursts on an existing TRX interface and record them, and if you want, replay them using the burst send tool. My own use case for this tool was that I had to understand how TCH half-rate burst encoding works, so I captured a flow of frames on some particular timeslot.

And this is how it works in more detail: FakeTRX also acts as a simple clock generator, and, as I already said, the control and data interfaces are forwarded.

The status of this project: we now have some simple simulation and randomization capabilities of
both timing of arrival and RSSI; we have support for burst capture (burst sniffing) and burst replay. The future plans are to allow multiple base stations to work with multiple mobile instances; this way we will be able to simulate some load, some stress testing for the base station, and so on. Okay, thank you for your attention, and you're free to ask your questions. This was a little story about my way to OsmoDevCon. Feel free to append your questions to the array of questions, and I will try to find the answer for you.

Well, not a question, just a comment. I think for testing there's much more that we can test using this than what we do so far. At the moment we use it basically just to generate valid messages, mostly signaling messages, from the TTCN-3 test case into the BTS, and then we check whether we receive them on the other side, and vice versa. The other thing we test is, I think, timing advance and receive power. (Yeah, it's very basic there.) But I think there are a lot of things that we can test, for example also in terms of DTX testing, because it's very complex; I think the most complex state machine we have in the entire Osmocom project is AMR DTX in the BTS. So I think there it could help us to really test all the different cases and see how the system behaves in all these different state transitions. So it's very useful, thanks.

A very easy question for me to answer myself, but I'll ask it anyway: what is the CPU usage of FakeTRX, is it heavy? I measured this: if you would use all timeslots of a single base station, on my CPU, it's an Intel Core... I'm sorry, I don't remember the model.
It is about 30 percent, or maybe about 10 percent; not too much.

Yeah, I mean, that's because there's no upconversion or anything like that, no actual RF samples, just the baseband burst bits. So it's basically the encoder and the decoder, mostly the decoder, probably the Viterbi, that are using the CPU.

Yeah, the main job of FakeTRX is simply to receive the UDP packets, change a few bytes in the header, and forward them to the BTS and vice versa; not too much to do. If you want, you can use PyPy or rewrite it in C, but it's not required.

No, I mean, it's not FakeTRX specifically; I think the question was more about the CPU usage of the virtual setup in general.

Yeah, it's because the things that the TRX normally does are not really done here: the Viterbi decoding and so on happens inside osmo-bts-trx, and that's the same whether you have a real network or not. But all the signal processing at sample rate is not done in this setup.

Viterbi decoding is also happening in the trxcon application, but we now support some SIMD instructions there and it works really fast. Any more questions?

So if I have some time, I would try to fit in one more small talk, and it is about the GSM Audio Pocket Knife, because it's also a little bit related to the work we did: I used it in the OsmocomBB project, and there were some discussions about transcoding in media gateways, so probably it will help someone, I hope.

So what is GAPK? Many people have asked me what these four letters mean: it stands for GSM Audio Pocket Knife. It was initially written by Sylvain. (Okay, I need to finish soon.) It supports encoding and decoding of audio formats; for example, you can use AMR or GSM files in different formats.
It supports classical GSM codecs like full-rate, half-rate, and AMR. Some work was done by Harald to make it support ALSA, so you can use the audio subsystem on your PC to play and capture sound, and also some work was done by Harald to support RTP streams, but I think that is unfinished. The use cases for this tool are, for example, transcoding one audio format into another; I think it's kind of an ffmpeg, but for GSM codecs. You can perform transcoding between, for example, full-rate audio and AMR, you can listen to real-time RTP streams, and do any transformations you want.

What I would like to say about the previous state of the project is that it was just a simple command-line tool, and there was no testing coverage; we only had some samples, what we feed in and what we expect after conversion, and that's all we had. We had no shared libraries, it was a single binary, and it was not that useful for me personally. Here you can see an example of how to convert one file into another.

So, why not turn this into a shared library? I think it's a good idea. With audio coding support, you can for example use this library for audio coding in the OsmocomBB mobile application; we are currently doing some work on implementing GSM blocks for GNU Radio to support audio coding for GSM; and it could also be used by the Osmocom media gateway to transcode streams, and probably someone else would like to use it too.

What was done by me: I created a new library called libosmogapk. Some work was done to put the headers into the common location for Osmocom projects; there is a separate directory for the GAPK headers now. All exported symbols now have the osmo_gapk prefix; it's a requirement.
We have a pkg-config manifest, so you can easily compile your programs and link them against this library. We now use the talloc subsystem for memory management, and we have somewhat better testing coverage, but it's not full yet. What else was changed: the gapk binary was renamed to osmo-gapk, because many Osmocom programs have this prefix, and it is linked against libosmogapk. We now use the Osmocom logging framework instead of simple printf, of course.

Some basics about the architecture of libosmogapk: it looks like GNU Radio flow graphs, so you just need to create a processing chain; it's called a processing queue in terms of GAPK. These are not as complicated as what you can create in GNU Radio: you have only source blocks, processing blocks, and sink blocks, and that's it. There is no scheduler; you just need to iterate through the chain yourself. There is an API for that, and you can allocate a few processing chains if you want to work with them that way.

Here is how to create your own block; for OsmocomBB I have a little experience of creating my own block, and I will show a simple example. All you need to describe is: how much data you are going to receive from other blocks (in the case of a source it's zero); how much data you are going to pass forward to the next block; some internal state, if you want to store something, for example file descriptors or sockets in the case of RTP; and the output buffer where the data we have just processed should be written. And there are two or even three callbacks, I don't remember exactly; one of them is proc.
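The processing-queue architecture just described can be modeled in a few lines. This is only a toy sketch in Python with invented names (the real libosmogapk API is C and differs): source, processing, and sink blocks with a work-style callback, iterated by hand because there is no scheduler.

```python
# Toy model of a GAPK-style processing queue. Block names and the
# run_queue() helper are invented for illustration; the real C API differs.

class ListSource:
    """Source block: receives nothing, emits one item per iteration."""
    def __init__(self, items):
        self.items = list(items)
    def proc(self, _):
        return self.items.pop(0) if self.items else None  # None = exhausted

class Upper:
    """Processing block: transforms its input (stand-in for a codec step)."""
    def proc(self, data):
        return data.upper()

class Collect:
    """Sink block: consumes the data (stand-in for a file or ALSA sink)."""
    def __init__(self):
        self.out = []
    def proc(self, data):
        self.out.append(data)

def run_queue(source, procs, sink):
    """No scheduler: the caller iterates the chain until the source is dry."""
    while True:
        data = source.proc(None)
        if data is None:
            break
        for p in procs:
            data = p.proc(data)
        sink.proc(data)

sink = Collect()
run_queue(ListSource(["fr", "hr", "amr"]), [Upper()], sink)
print(sink.out)  # ['FR', 'HR', 'AMR']
```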
The proc callback is just like the work function in GNU Radio; and there is a destructor, so if you allocate some memory, this function should be used to free it; and there is some meta information for logging.

And here is a simple example of how to create your own block. This is a file input block; all it is intended to do is read some data from a file, nothing else. The first thing you need to do is allocate memory for the processing queue item, and define the type of this block: it's a source block. You can give it some name. You don't set the input length, because again it's a source block, but you specify the output data length, and you specify two callbacks: the first one is like the work function in GNU Radio, the last one is the destructor. And here is an example of what the proc function can look like: you just read some data from the file and put it into the output buffer, and that's it. It's pretty easy, and I recommend everyone to try to write their own block for GAPK.

It's really easy to use this library, because we don't have that many headers: for the processing queues and blocks you have a separate header file; for formats... I'm sorry, this should say codecs; for codecs you can use this header; and all symbols have the osmo_gapk prefix. A simple processing queue. Ready to handle your questions.

I have a question: why not use GStreamer or ffmpeg? I mean, I'm not sure they actually support all the GSM codecs, but then again, if we're implementing them, why not implement them there?

It's just a simple project, and ffmpeg is too complex for doing some basic audio encoding; that's my opinion.

No, I mean, one of the fundamental differences to a lot of the codec frameworks you find out there is, for example, that we have no fixed size of codec frames, especially if you think of something like AMR. I'm not sure,
I mean, at least I think that's the difference, but for sure it's something that can be looked at.

CBR is not the only way to do it; there are many VBR formats out there, and ffmpeg works with them perfectly.

I mean, the reason I didn't use those two was that if you look at the original source code of gapk, like 95% of the source code is completely codec-specific, and the actual framework part I would have needed to implement is like 100 lines. And I wasn't going to introduce a dependency on GStreamer and GLib and all the initialization code that I would need to set up a GStreamer flow graph just to literally take, you know, the 26 bytes or however much it was, call one function, and store the output. Originally, that was mostly because I was dealing with, you know, the format that is generated by the Racal test set, and the DSP API of that thing, where most of the work is not even codec work: it's reordering the bits, because they managed to put all the parameters in, you know, 50 different orders; basically every possible combination, somebody used it. It just seemed overkill.

I think another advantage, and exactly why not ffmpeg: with C it's much easier to embed this.

For sure, you can compile it wherever, in any way you want; you can compile only the codecs that you need. Exactly.

Yeah, if we have some time, we have about five minutes I think, I'm going to try to show some real example of the usage of GAPK in Osmocom and OsmocomBB.

Can I also ask, what was the status with ALSA and RTP streams? Would it be possible to capture an RTP stream off the wire and play it through ALSA?

This is exactly what I'm going to show you; we can try a simple live demo.
In GAPK we have more or less complete help output for that. For example, you can capture frames from ALSA: you just need to specify which device you are going to use for capturing, and you need to specify the input format; in this case it's the raw frame format, raw PCM. Then you need to specify which codec, or, to be more correct, which format you are going to use, for example GSM full-rate or AMR, and write this to some output file in a temporary directory. Yeah, it's starting, and we can try to say something here; it should capture it. Now we can use the same tool to play this captured stream, if I didn't break anything. You need the input file, the input format which we used here, the format we are going to convert it back to (raw PCM), and the ALSA sink, the default playback device. A simple example. Yeah, I hear myself here.

And here we have some basic integration. What I demonstrated at the Congress is that we are only able to normally decode the downlink frames, but the uplink is not working; it's my fault, because I initialized the GAPK library at the beginning of execution of the mobile application, and I should do this exactly when we start a call and tear it down when we finish the call, but it was like a hack.

Is it visible? Yeah, I will show this. So what's the idea then, to use it with OsmocomBB? Do you have something in mind, or is it already working there? Yeah, I can even try to show this in action; here my demo is preparing. So in OsmocomBB, originally you could make voice calls, but the audio would actually be on the phone, right? No, we have MNCC in OsmocomBB.
Exactly, that's how Jolly did it. So the traditional way of using voice in OsmocomBB is that you have the voice on the MNCC interface, and you can attach LCR to it or something like that. And this is exactly why I use OsmocomBB this way: we have LCR working there, at least until we require more support.

And yeah, also here you can see FakeTRX in action: we just start the base station and we have our own virtual network. We need to switch to mobile, start trxcon, and start the mobile application. Again, we don't actually transmit anything. Here's our environment; we put in some virtual SIM cards. Then we can try to test it. This is exactly what I demonstrated at the Congress, how it works together with OsmocomBB; nothing special. We have some problem with the uplink. For example, there is another number, 993. The problem is that we hear sounds from the past: when we started mobile, it had already started to record the frames and put them into a buffer, and now we read them back, probably the sound of my keyboard. Sounds from the past, crazy. And yeah, that's all. We still have some work to do on the source code, but I'm not sure... Yeah, if you have any questions, as usual, feel free to ask.