I used this in the OsmocomBB project, and there were some discussions about transcoding in media gateways, so it will probably help someone. I hope so. So what is GAPK? Many people ask me what it is. It's four letters, and it stands for GSM Audio Pocket Knife. It was initially written by Sylvain.

It supports encoding and decoding of audio formats; for example, you can read and write AMR or GSM frames in different formats. It supports the classical GSM codecs like full-rate, half-rate and AMR. Some work was done by Harald to make it also support ALSA, so you can use the audio subsystem of your PC to play and capture sound. Some work was also done by Harald to support RTP streams, but I think it's unfinished.

The use case is, for example, transcoding one audio format into another. I think of it as a kind of ffmpeg, but for GSM codecs: you can transcode, for example, full-rate audio into AMR, or listen to real-time RTP streams and transform them however you want.

What I would like to say about the previous state of the project is that it was just a simple command-line tool, and there was no test coverage. We only had some samples and what we expect after conversion, and that's all. There was no shared library, just a single binary, which was not that useful for me personally. Here you can see an example of how to convert one file into another.

So, why not turn this into a shared library? I think it's a good idea. For audio coding support, you can use this library in OsmocomBB, in the mobile application. We are currently doing some work on implementing
gr-gsm blocks for GNU Radio, to support audio coding for GSM. It could also be used by the Osmocom media gateway to transcode streams, and probably someone else would like to use it too.

What was done by me: I created a new library called libosmo-gapk. Some work was done to put the headers in the common location for Osmocom projects; there is a separate directory, osmocom/gapk. All exported symbols now have the osmo_gapk prefix; it's a requirement. We have a pkg-config manifest, so you can easily compile your programs and link them against this library. We now use the talloc subsystem for memory management, and we have somewhat better test coverage, but it's not complete yet. Also, the gapk binary was renamed to osmo-gapk, because many Osmocom programs have this prefix, and it is linked against libosmo-gapk. We now use the Osmocom logging framework instead of simple printf() calls.

Some basics about the architecture of libosmo-gapk: it looks like GNU Radio flow graphs. You just need to create a processing chain; it's called a "queue" in terms of GAPK. These are not as complicated as what you can create in GNU Radio: you only have source blocks, processing blocks and sink blocks, and that's it. There is no scheduler; you just iterate through the chain yourself. There is an API for that, and you can allocate a few processing chains and work with them however you want.

Here is an example of how to create your own block, because for OsmocomBB I have a little experience of creating my own block, and I will show a simple example. All you need to describe is how much data you are going to receive from the previous block (in the case of a source, it's zero) and how much data you are going to pass forward to the next block. There is some internal state, if you want to store something, for example file descriptors, or sockets in the case of RTP. And there is an output buffer, where you should write your data.
There are two, or even three — I don't remember exactly — callbacks. One is proc(); it works just like in GNU Radio. There is a destructor: if you allocate some memory, this function should be used to free it. And there is some meta information for logging.

Here is a simple example of how to create your own block. This is a file input block; all it is intended for is reading some data from a file, nothing else. The first thing you need to do is allocate memory for a processing queue item, define the type of this block (it's a source block), and you can give it a name. You don't have an input length because, again, it's a source block, but you do have some output data length, and you specify two callbacks: the first one is like the work function in GNU Radio, and the last one is the destructor.

And here is an example of how this proc() function can look: you just read some data from the file and put it into the output buffer, and that's it. It's pretty easy, and I recommend everyone to try to write and run their own block for GAPK. It's really easy to use this library, because we don't have that many headers: there is one header file for our processing queues and blocks, a separate header file for formats — sorry, here it should say codecs; for codecs you can use this header — and all symbols have the osmo_gapk prefix, with simple processing queues to handle. Your questions?

[Question] I have a question: why not use GStreamer or FFmpeg? I mean, I'm not sure they actually support all of these codecs, but then again, if we're implementing them, why not implement them there?

[Answer] It's just a simple project, and FFmpeg is too complex for doing some basic audio encoding — that's my opinion.

[Comment] No, I mean, one of the fundamental differences, I think, to a lot of the codecs that you find out there is, for example, that we don't have a fixed size of codec frames, especially if you think of something like AMR. I'm not sure.
At least I think that's the difference. But for sure it's something that can be looked at; CBR is not the only way to do it, there are many VBR formats out there, and FFmpeg handles them perfectly well.

[Comment] I mean, the reason I didn't use GStreamer is that if you look at the original source code of gapk, like 95% of it is completely codec-specific, and the actual stuff I would have needed to implement — the framework part — is like a hundred lines. I wasn't going to introduce a dependency on GStreamer and GLib and all the initialization code needed to set up a GStreamer flow graph just to literally take, you know, the 26 bytes, or however much it is, and call one function. It just seemed overkill. And the odd formats — originally that too was mostly because I was dealing with the format generated by the Racal test set and the DSP API of the thing, where most of the work is not even codec stuff; it's reordering the bits, because they managed to put all the parameters in, you know, fifty different orders, basically every possible combination somebody used.

[Comment] I think another advantage — exactly why not use FFmpeg — is that it's much easier to embed this.

[Answer] For sure: you can compile it wherever and in any way you want, and you can compile only the parts that you need. Exactly.

Yeah, if we have some time — we have about five minutes, I think — I'm going to try to show a real example of using GAPK in Osmocom and OsmocomBB.

[Question] Can I also ask, what is the status with ALSA and RTP streams? Would it be possible to capture an RTP stream off the wire and play it through ALSA?

[Answer] This is exactly what I'm going to show you. We can try a simple live demo.
In GAPK, I think, we have more or less complete help output for that. For example, you can capture frames from ALSA — the default device, I think. You just need to specify which device you are going to use for capturing, and the input format; in this case it's the raw frame format, raw PCM. Then you need to specify which codec — or which format, to be more correct — to use, for example GSM full-rate or AMR, and to write this to some output file in a temporary directory.

Yeah, it has started, and we can try to say something here; it should capture it. Now we can use the same tool to play back this captured stream, if I didn't break anything. You need the input file, its input format — the one we used here — and the format we are going to convert it to, raw PCM again, and play it to the ALSA sink, the default playback device. A simple example. Yeah, I can hear myself here.

And here we have some basic integration of audio, which we demonstrated at the Congress. We are only able to decode downlink frames normally, but not uplink. My bet is that it's because I initialize the GAPK library at the beginning of execution of the mobile application, while I should do this exactly when we start a call and finish a call; but that was like a hack, so I'm not sure. Is it somehow visible? Yeah, I will show this.

[Question] So what's the idea then, to use it with OsmocomBB? Do you have something in mind, or is it already working there?

[Answer] I can even try to show this in action, while this is preparing here.

[Question] In OsmocomBB, originally you could make voice calls, but the audio would actually be on the phone, right?

[Answer] No, we have MNCC in OsmocomBB. Exactly.
That's how Jolly did it. The traditional way of using voice in OsmocomBB is that you have the voice on the MNCC interface, and you can attach LCR to it, or something like that.

And this is exactly why I use OsmocomBB: we have LCR working, until we repair the support. Also, here you can see the fake TRX in action. We just start the base station, and we have our own virtual network. Then we need to switch to mobile, start the TRX, and start the mobile application. Let me show this. Again, we don't actually transmit anything over the air. Here's our environment; we put in some virtual SIM cards for Germany, and then we can try to test. This is exactly what I demonstrated at the Congress: how it works together with OsmocomBB. Nothing special.

We have some problem with uplink. For example, there is another number, 993, and the problem is that we hear sounds from the past: when we started mobile, it already started to record frames and put them into a buffer, and we read them later — that's probably the sound of my keyboard. Yes, from the past. Crazy.

And yeah, that's all. We still have some... we do have some source code, but I'm not sure we have... Yeah, if you have any questions, as usual, feel free to ask.