So we continue with the next talk, "What's new in GStreamer?", a very general talk on what's going on there, by Tim-Philipp Müller.

Hi everybody, thanks for coming. Let's talk about GStreamer. I've got way too many slides, so I'll be very quick. It's going to be a very high-level talk, and there are more GStreamer talks coming up right after this one and later today.

A quick introduction: who am I? I'm Tim-Philipp Müller, one of the main GStreamer maintainers and developers. I've been hacking on GStreamer for a while, about ten years now, and I work for Centricular, where we help customers with GStreamer.

What is GStreamer, just in case you've come in to have a peek? Basically, it's a general framework for multimedia processing. We try to provide building blocks that you can combine freely, for reading, streaming, downloading, encoding, decoding, muxing, RTP payloading and so on, to do whatever you want with your media flows. We aim to be cross-platform: Windows, Android, iOS, Linux of course, and embedded systems. And we are toolkit-agnostic.
We use GLib, but we also have integration for Qt and really anything else you want; we don't tie you to anything. We want to support any and all use cases, from editing to streaming to playback and recording. What we provide is a set of libraries and plugins, and you write your application on top. The API is abstract and very flexible, and hopefully lets you do anything. We don't reinvent everything from scratch; we build on other libraries and components, of course. There's low-level API that gives you full control, and high-level convenience API for things like playback, transcoding, video editing and so on. We have an RTSP server library that lets you easily stream things over RTSP, all that kind of stuff. And we try not to make unreasonable assumptions, so you can use GStreamer everywhere: integrate it in your browser, use it in OpenGL applications, whatever you want, you should be able to use GStreamer with it.

All right, so what have we been up to? Releases: the last release was a while ago, 1.12 in May, and that's our current stable release. We were aiming for a six-month cycle, but I decided to delay the upcoming one a little, so 1.14 is going to come out really soon now. We'll probably make a pre-release next week, and hopefully later this month or early next month we'll have 1.14 out. That's our next stable feature release.

So what has landed that's interesting? Video conversion and video scaling are now multi-threaded, so if you've got high-resolution content you can spread the work over your cores very nicely.

Timed Text Markup Language, TTML: there's a new plugin for that. This is quite nice; it's a fairly new standard with the potential to describe subtitles and text markup in general, so I'm quite excited about it. It's not enabled for autoplugging yet, though;
you have to set an environment variable to enable it. It supports the basic profile, at least.

splitmuxsink: that's an element that fragments your media stream, so you can say "create a new file every so many megabytes or gigabytes, or every so many seconds, minutes or hours", whatever you want. It takes your encoded streams and splits them, and it works with any container, even ones that don't natively support that, so you can use Matroska, MP4, whatever you want, and it'll just work. That element has been rewritten to be more deterministic, so it should be more stable now. And there's a new format-location-full signal, which hands you the first buffer of a new fragment when it starts, so you can read the timestamps and other metadata off it, for example if you want to derive special file names or do something else in response.

DASH trick-mode playback also landed. That was quite some work; it's not entirely trivial, because you need to stay within your bandwidth envelope. You don't have infinite download bandwidth, so you need to find out where the keyframes are, skip keyframes, skip segments, and figure out how to use the full bandwidth, but not more, while still squeezing as many frames out of it as possible. That landed, and it works quite nicely with certain DASH streams.

We also have loads of new features and performance improvements on embedded, which I'm not going to talk about: Video4Linux2, OMX, dmabuf, zero-copy, all that stuff. Olivier has a talk right after this one and he's going to tell you everything about it.
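As a sketch of how splitmuxsink is typically used (the pipeline below is mine, not from the talk; the test source, encoder settings and file name pattern are illustrative, and a webcam source could stand in for videotestsrc):

```shell
# Split an H.264/MP4 recording into a new file every 60 seconds.
# max-size-time is in nanoseconds; the default muxer is mp4mux.
gst-launch-1.0 -e videotestsrc is-live=true \
    ! x264enc tune=zerolatency key-int-max=30 ! h264parse \
    ! splitmuxsink location=clip%05d.mp4 max-size-time=60000000000
```

The format-location-full signal mentioned above is an application-level feature: from code you would connect to that signal on splitmuxsink and inspect the first buffer of each fragment to build custom file names.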
So I'm just going to skip all of that.

Hardware-accelerated video encoding and decoding: we have lots of that, of course. There's a new msdk plugin for Intel's Media SDK, which provides video encoding and decoding on Intel hardware; that works on Linux as well as on Windows. Then there's gstreamer-vaapi, which already existed and is based on an open-source stack, but only works on Linux; it has also seen loads of new features and fixes, and the encoders are now autoplugged, so they work quite nicely. We have a new nvdec plugin for decoding on the Nvidia graphics stack, and we already had an encoder for that, which has a few new features.

What else is coming up? Something that just landed is AOMedia AV1 support. AV1 is basically the next-generation video codec, hopefully better than H.265, and it's going to be royalty-free as an open standard. Tim Terriberry is going to talk about it at 5 p.m. later today, and I'm really excited about it. Once we have that, it's going to be widely deployed and widely supported. Apple just joined the foundation, AOMedia, so it's just going to work. At that point we'll have a cutting-edge audio codec, Opus, and hopefully a cutting-edge video codec too, and that's going to be great: we can ditch all the MPEG nonsense. The codec is still experimental; I think the bitstream either has just been stabilized or is about to be stabilized. Go to Tim's talk.
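A quick smoke test of the new AV1 elements might look like this (my sketch, assuming the aom plugin from gst-plugins-bad is built; element names as merged):

```shell
# Round-trip a few frames through the AV1 encoder and decoder.
# Encoding is extremely slow at this stage, so keep the input tiny.
gst-launch-1.0 videotestsrc num-buffers=5 \
    ! video/x-raw,width=160,height=120 \
    ! av1enc ! av1dec ! fakesink
```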
The encoding is still very, very slow, but the integration is there, so you can start playing with it if you like.

There's a new plugin called ipcpipeline, which basically allows you to split a GStreamer pipeline over multiple processes. Again, Olivier is going to cover that in his talk.

We now have a ring buffer for debug logs. The thing is, if you enable debugging in GStreamer we log a huge amount of stuff; you can easily accumulate hundreds of megabytes, even gigabytes, of debug logs. But sometimes people have a problem like "after three days of streaming I run into this error", and you can't really make a debug log for that. So now, with the ring buffer, when you hit the problem you can just grab the last few megabytes of logs out of it. It's quite nice, and really simple to do, but nobody had actually done it before.

We have a tracing framework, and that has seen quite a few improvements. There's a leaks tracer in particular that also works on embedded systems: Valgrind is nice on the desktop, but the leaks tracer has much less overhead, which is much nicer on constrained hardware. The tracers can do stack traces, of course, and we can do snapshotting now, and you can figure out where the latency in your pipeline actually comes from without digging through the debug logs in too much detail.

hlssink2: with the old hlssink you feed it an MPEG-TS stream, which is not always convenient. hlssink2 instead takes elementary streams: you give it an encoded video stream and an encoded audio stream and it does the splitting and muxing for you. It uses splitmuxsink internally, and it works much more nicely with content that is already encoded. The old hlssink kind of relies on having an encoder upfront, so it can force keyframes at the fragment boundaries; hlssink2 will work without that.
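The tracers are enabled purely from the environment; typical invocations (the pipelines themselves are illustrative) look like:

```shell
# Leaks tracer: on EOS or interrupt, report GStreamer objects and
# mini-objects that were never freed.
GST_TRACERS="leaks" GST_DEBUG="GST_TRACER:7" \
    gst-launch-1.0 videotestsrc num-buffers=100 ! fakesink

# Latency tracer: log per-pipeline latency measurements, so you can
# see where latency comes from without wading through full debug logs.
GST_TRACERS="latency" GST_DEBUG="GST_TRACER:7" \
    gst-launch-1.0 audiotestsrc num-buffers=100 ! autoaudiosink
```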
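A minimal hlssink2 sketch, assuming a software H.264 encoder (the pipeline is mine, not from the talk; an already-encoded stream could be fed in instead of the encoder):

```shell
# Feed hlssink2 an elementary H.264 stream; it fragments and muxes
# the HLS segments internally via splitmuxsink.
gst-launch-1.0 -e videotestsrc ! x264enc ! h264parse \
    ! hlssink2 max-files=5 target-duration=5 \
        playlist-location=playlist.m3u8 location=segment%05d.ts
```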
So that's nicer for that use case.

RTSP: we have an RTSP server library, and client support of course, so you can easily serve streams over RTSP with very little effort; it's used in security cameras and whatnot. RTSP 2.0 support has just landed, and I believe we might be the first to implement that. There's also the ONVIF audio backchannel, which is a horrible extension to the standard that allows you to send audio back over a playback RTSP connection; that's coming up soon.

In general, we're on a mission to consolidate our plugin modules. Our plugins are split into multiple modules, called base, good, ugly and bad. People don't like "bad": they see juicy plugins in there, and the name is kind of an inside joke, but they get worried. We haven't really been putting enough effort into moving things from bad into good or base, so we're trying to change that. Usually we add new stuff in bad until the API is stable and we're happy with it, and then move it over, but we just haven't been diligent enough about that, so we're making an effort to move more things into the core modules. The good news is that the MP3 patents have expired, which means we could move the MP3 decoder and encoders, and the MP2 encoder, into good, and we've done that, which is nice. The AC-3 patents have also expired, but unfortunately we can't move that decoder, because it's GPL-licensed and we don't do GPL in our core and good modules.

What else? We have a new bunch of mixers: audiomixer for audio and compositor for video. They're based on a new aggregator base class, which handles live streams properly, so you can actually have a defined latency for the mixer. Basically, if one of your input streams drops out because someone pulled a network cable, you still want your pipeline to keep running rather than jam up, right?
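As an illustrative sketch of the kind of live mixing this enables (the pipeline is mine, not from the talk; the latency value is an example, in nanoseconds, and per-pad positioning would normally be configured from application code):

```shell
# Mix two live video sources with compositor. The configured latency
# lets the mixer keep producing output even if one input stalls.
gst-launch-1.0 \
    compositor name=mix latency=100000000 ! videoconvert ! autovideosink \
    videotestsrc is-live=true ! mix. \
    videotestsrc is-live=true pattern=ball \
        ! video/x-raw,width=160,height=120 ! mix.
```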
So we made a new base class for that; the old base class didn't handle it very nicely, and the new one works quite well. We've moved the aggregator base class into core now, and we'll hopefully move the audio-specific one as well, and the video one soon after, and then we can start porting muxers to the new base class. flvmux has already been ported, and the other muxers will follow; at that point you'll have a much nicer experience building live pipelines that don't jam up.

Our OpenGL integration library and plugins have moved from bad into base as well, and the API is stable now, so you can build on it; we're quite happy with that too.

Then WebRTC. WebRTC is very nice; if you don't know what it is, think of it as Skype in your browser. People ask "how do I stream stuff to my browser?", and the answer is always "well, it's not so easy". It depends on which operating systems, which browsers and browser versions you want to support, which codecs you can use; it's a mess, really. You might get away with DASH and HLS in most cases, but it's not so easy. WebRTC has its own advantages and disadvantages compared to adaptive streaming; HLS and DASH are made to be scalable. But still:
I think WebRTC is going to be big, because it's going to work everywhere, in most recent browsers, sooner or later. So yesterday we merged the GStreamer WebRTC plugin and library into gst-plugins-bad. It uses libnice for the ICE stuff, to get you through firewalls and so on. If you're interested, Nirbheek has just written a blog post about it, and we also have a demo repository, which might be a little bit-rotten, but it should work. It's really nice, because it allows you to easily stream to WebRTC clients using GStreamer, and you can leverage all of GStreamer.

There were existing efforts. One was OpenWebRTC, sponsored by Ericsson; that's kind of dead now, and the reason it wasn't continued is that there was a mismatch between the API it offered and what we as library developers think developers need, so it was easier to do a new one. Kurento also has something, but it's more media-server focused; it's a very rich framework, and I don't know how actively it's developed, though it might just be picking up again. There were also some proprietary solutions, but we're open-source people, so we want our stuff open source, and we really want it in GStreamer. And then there's libwebrtc from Google, of course, which is the thing more or less everyone uses. But anyone who has used it knows it's horrible, really horrible; it's so painful. It works, but it's very limited, and if you want to do advanced stuff you have to fork it; even just building it seems to be a problem.

Anyway, our new GStreamer WebRTC support is very flexible.
You've got full control, and it more or less maps the existing WebRTC APIs, so you don't have to learn something completely new in that respect. The nice thing is you can leverage all the existing stuff in GStreamer: hardware encoders and decoders, zero-copy capture and rendering; you can make all of that work, so it will work on embedded immediately, and you can feed it pre-encoded content. And you don't have to fork libwebrtc and maintain and update it. It's not fully finished, not super complete, but it's used in production, and it works well for what it does. It doesn't do everything yet: renegotiation isn't fully supported, and receive-only streams don't fully work yet. But the internals more or less map the spec, so you can easily see where the gaps are and fill them in; if you want to help with that, help is wanted.

We have performance optimizations more or less everywhere. There's so much stuff in the pipeline, but it doesn't seem worth going into detail here; for the embedded parts, see Olivier's talk.

SRT, Secure Reliable Transport, is a new thing, very much hyped and marketed, but it's also very nice, and there seems to be a lot of support for it industry-wide, so it seems well placed to replace RTMP. We've just merged source and sink plugins for it, so you can stream over SRT if you like.

Autotools is our current build system, but we're moving to Meson. Meson basically takes the best parts of CMake and improves on them; we didn't like CMake. It has a very nice, maintainable description language that isn't Turing-complete. It still has a few things missing, but for us a major motivation was that it gives us a Microsoft Visual Studio build, which works. There's still some work to be done; we're going to switch to Meson fully and drop autotools, but it needs to be ready, and we need to make it work everywhere.

Rust: Rust is awesome.
It's a new programming language, originally from Mozilla. It's basically the C and C++ you always wanted: a systems programming language, but safe and productive, and more or less as fast as C and C++, with zero-cost abstractions. It's awesome. We're not going to port GStreamer to Rust anytime soon, but we're playing with it and looking at it; it matches our memory model, ownership and so on, really well. The bindings are in excellent shape and can be used in production. Sebastian has a talk tomorrow at 11; you should check it out. Our GStreamer C# bindings have also been rejuvenated, and they should be up to date now.

What can we improve on? Well, the usual things, of course, but one thing that's a pet peeve of mine: adaptive streaming on the client side is really well supported, but the production side isn't supported as nicely. We have hlssink, but it's not really that nice to use. The RTSP server has been a massive success for us because it's so easy to use and so powerful, and it would be really nice if we had something similar for DASH and HLS. In general, writing simple servers should be easier: just like there's a sink element, there should be something like an HTTP server element. People like making little pipelines and just running gst-launch to serve their webcam; they don't necessarily want to write code. Sometimes they just want to use the library for a small use case, make a pipeline, use a plugin, and that's all they need.

That's all I have. Thank you very much, and thanks to the organizers, of course. If you have any questions or comments, I can pass the mic. Don't be shy. No questions? Excellent.
Oh, there is one. Not really about what's new, but is there any news on SIP support in GStreamer, the Session Initiation Protocol?

Well, you can use GStreamer for that, I think, but there are a lot of libraries involved. There's a library called Farstream, a GStreamer-based library that implements the media part you need to support SIP; we developed it almost a decade ago, it has been used in production, and it's used by Pidgin and Empathy. But it doesn't do the actual SIP protocol part; for that you need a different library, and there's a bunch of them. They're all terrible, because SIP is terrible.

Any other questions? One in the back; I didn't see you there in the dark.

Thanks for your talk. Last year I was trying to find out how to query the dimensions of the screen in GStreamer and feed them back into the pipeline, and I couldn't work it out. What's the best way to find support on the web?

You mean GStreamer support, for technical questions, right? Well, that brings me to my last slide. The best way is actually to find us on IRC: we're in the #gstreamer channel on the Freenode network. That's the best way to get questions answered quickly; most developers are there during European and North American daytime, and we have some Australian people as well, but those are the busiest times. There's also the GStreamer development mailing list, but it's fairly high volume, and people answer when they have time and skip it otherwise, so IRC is best. In general, follow us on Twitter as well; it's not high traffic, but blog posts and the like get announced there. We have a hackfest coming up in spring, and the conference at the end of the year, probably in Edinburgh; that's yet to be confirmed. Anyway, thank you very much, and Olivier is up next with GStreamer for embedded.