Alright, so I'm going to do my best here. Again, I apologize for the delay. I'm having a couple of technical challenges, but I'll do my best with what I have and hopefully you can see it. My name is Matthew Fredrickson and I am the open source project lead for the Asterisk project. I'm really glad to be here. This is the first time I've spoken at FOSDEM. Some of my predecessors have been here and talked a little bit about Asterisk, but this is the first time I've done it. I want to start off with a quick question: how many of you have used Asterisk before? Raise a hand. That's good. I was thinking about my talk and how I wanted to target it toward my audience, and I wasn't sure who they were going to be, so that's comforting to hear. Most of you are probably familiar with what Asterisk is and its background. I thought that was a pretty safe assumption, so I'm not going to go into a lot of detail on that; mostly what I'd like to do is cover a few things about what's going on with the Asterisk project. Just as some background, I started doing this a couple of years ago. Matt Jordan, for those of you who remember him, was my predecessor leading the Asterisk project, and he got the opportunity to move within Digium, the company that sponsors Asterisk, into a more oversight role as CTO. With that, they needed somebody to step in, take care of the day-to-day management, and run the Asterisk project, and they asked me to do that. I've worked at Digium since about 2001 in various development capacities, doing lots of different things. I've worked on Digium's hardware.
For those of you who go really far back and used Asterisk in a hardware environment, with old ISDN cards or analog cards and things like that, I worked a lot on the hardware, specifically the device drivers and the design of that hardware. Through the years I maintained a lot of the traditional telephony interfaces within Asterisk, like the DAHDI channel driver, or the Zaptel channel driver before it. I maintained the Zaptel and DAHDI kernel device drivers, and extended and improved those. I also maintain libpri, which is the ISDN stack for Asterisk, for those of you who remember what ISDN is; I guess ISDN is going away in a lot of places. Then at some point, a number of years ago, I realized there was no SS7 or C7 stack for Asterisk, so you couldn't talk directly to really big phone switches, and I thought that was a shame. I have a strong passion in my heart for old, dead telephony protocols, and I felt that Asterisk would not be complete without being able to talk directly to an SS7 or C7 switch. So I decided to start working on an SS7 stack for Asterisk, which was called libss7. I got pulled out of dead-telephony land, probably about five or six years ago now, to go work on some new-world WebRTC-related initiatives as that technology started to percolate and become more interesting. Digium wanted to build some tools and infrastructure related to WebRTC, and I got to do that. That was really a lot of fun, turning my face a little bit more forward. After that they asked me to manage the Asterisk project, and that was about two years ago.
And so it's been an honor and very exciting to work with all the really smart people, not just at Digium but all over the world, who consider it valuable to themselves and to those around them to donate their time to work on the project and extend it. Today I'll talk a little bit about history first. Actually, let me ask a question: how many of you used Asterisk when it was at its 1.0 version? Oh man, that's more hands than I expected; that's really good. That was a long time ago, probably the early 2000s, quite a few years ago. It was surprisingly feature complete, at least spec-wise. It had ISDN support, which back then was very important. It had H.323 support, it had MGCP support, it had a SIP channel driver. It had programming interfaces that we still use today, like the AGI and AMI interfaces, for doing application-based programming with Asterisk. So it was a surprisingly well-rounded version of Asterisk, and you could look at it and say, well, we could just stop right there, but obviously progress doesn't stop. We moved forward and did a lot of things to it, and eventually the 1.6 version came out, which included not just support for old-school 8 kHz narrowband telephony; we decided we wanted to extend the core of Asterisk to support wideband telephony, so your calls sound a lot better, you can hear the difference between consonant sounds better, and the quality is better. And since I implemented SS7 support, I feel a shameless obligation to plug that this was the version where we merged SS7 support into Asterisk. Then some more time passed and we had many more versions, and more things happened.
These are just some highlights. Asterisk 11 came out, and it had the beginnings of WebRTC support, which I would say is the next big thing in the real-time communications space in the last few years. We started to implement that in the SIP channel driver, extending it to interoperate with browsers and other WebRTC endpoints. In Asterisk 12, we decided that the SIP channel driver we'd been using for 13 years or whatever it had been had gotten to the point where it was very difficult to work on. When we tried to fix a bug, the way it was built, having never been redesigned to be easier to work on, meant we'd end up creating three or four more. It got to the point where we were not happy working on it, so we took a step back and wrote a new SIP channel driver that was better architected and better suited for extensibility, change, and bug fixing, and that had a better-tested implementation from the get-go. That's where the PJSIP channel driver came from; it's based on the PJProject SIP stack, and it's worked very well. In Asterisk 13, we added a new interface, the ARI REST interface, because REST interfaces are very popular, they make a lot of sense from a programming perspective, and that's where the industry was going. So we built a REST interface for Asterisk, and did some more work on the PJSIP channel driver as well. Asterisk 14, the previously released version, included more ARI improvements, fleshing out the APIs and the capabilities you have for controlling Asterisk through the REST interface.
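To make the ARI REST interface a bit more concrete, here is a minimal sketch of talking to it from Python. The host, port, and credentials below are illustrative assumptions, and ARI has to be enabled in Asterisk's http.conf and ari.conf before any of this works.

```python
# Minimal sketch of calling the Asterisk REST Interface (ARI).
# Host, port, and credentials are illustrative assumptions.
import base64
import urllib.request


def ari_request(path, host="localhost", port=8088,
                user="asterisk", password="secret"):
    """Build an authenticated GET request for an ARI endpoint."""
    url = f"http://{host}:{port}/ari{path}"
    req = urllib.request.Request(url)
    # ARI uses HTTP basic authentication against users defined in ari.conf.
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    return req


# Against a running Asterisk instance you could then list active channels:
# with urllib.request.urlopen(ari_request("/channels")) as resp:
#     print(resp.read())
```

The same pattern applies to the other ARI resources (bridges, endpoints, and so on); only the path changes.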
There were more PJSIP improvements, and we moved asynchronous DNS into the core of Asterisk, which is a good thing for a lot of people. So, Asterisk 15: that's what I want to talk about. That was the most recently released version of Asterisk, released in October at AstriCon. We looked at where things were going and what Asterisk had been doing. Asterisk lives in this land that merges old and new worlds; that's where it was born. You could connect really old telephony protocols with new-world telephony protocols. In the early 2000s that was maybe talking ISDN and connecting it to a SIP call, or even an H.323 call, which now seems like dinosaur technology. That's where Asterisk has lived, and we realized there was a space we needed to fill in terms of bridging these old and new worlds again, this time with video. Asterisk has actually had video support for a very long time; it's just been limited, and given how the market has changed and how people's application requirements have changed, we decided some changes were needed in the core of Asterisk as well to keep up. Traditionally you have a single, fixed video experience. I'm going to try to be quick because I'm a little bit late. It's called the MCU experience, where you have a central mixing box for video conferencing. That box takes everybody's video streams, decompresses them, mixes them all together into a grid of squares or something like that, then re-encodes the result and sends it back out. That worked for a very long time, and it still works for some use cases.
There are some cases where that's a better fit. But with all the really cool browsers that came out, with WebRTC and related technology, you got this really awesome new kind of client. Traditionally you had a somewhat proprietary client and a somewhat or fully proprietary mixer. The browser brought a really programmable, open client that let you do things that weren't really possible before. The MCU experience works, but it doesn't let you build really rich client experiences in the browser, because that video experience is fixed for the most part. We realized this was an area we wanted to improve upon. What we did with Asterisk 15 is start to remove some of the limitations that kept Asterisk where it was. Previously, an Asterisk video conference was a video switching experience: you had multiple video streams coming in from different participants, the bridge switched to whoever the active talker was, and everybody in the conference received that single stream. That was the limitation we wanted to get rid of, to be able to support some of these new conferencing experiences. We allowed renegotiation of video streams in the core of Asterisk, so you can dynamically add and remove streams on a call session; that was a requirement. And then we added support in the conferencing application not just for video switching, but for showing all video streams simultaneously.
So everybody in the conference can receive everybody else's stream at the same time, as long as the client permits it, much like what Janus and some of the other video projects do. That experience is called a selective forwarding unit, or SFU. If you look at my diagram, each of the browsers is sending one video stream to Asterisk and receiving two more back, one from each of the other participants. You can do some really neat things with that. Say you're a client watching a presentation, and you have two streams coming in: one is the head of the person talking, and one is the slide feed from the presentation. As a participant, say you like to take notes by hand while you're watching that slide feed. In a fixed experience, where the client doesn't have access to both streams, if the presenter decides "I'm done with this slide, I want to swap the feed back to my head," then everybody else had better be done with it too; that's what you get. But with an SFU experience, you have access to both streams, and you as a participant can say, no, I want to hold on to the slide stream longer so I can finish writing my notes. So you can build much more powerful client-side experiences with an SFU environment, which is really compelling. Anyway, that's the approach we took with Asterisk: we tried to make it into a video selective forwarding unit.
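For reference, the SFU behavior described above is selected per bridge profile in the ConfBridge application. Here is a minimal sketch of a confbridge.conf; the profile names are illustrative, and the Asterisk 15 sample configuration documents the full option set.

```ini
; confbridge.conf (sketch; profile names are illustrative)
[default_bridge]
type=bridge
video_mode=sfu        ; forward each participant's video to all other participants

[default_user]
type=user
```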
Let's see. I'm going to talk a little bit more about some other things in Asterisk 15, but if you'd like to play around with the SFU support, there's a sample browser-based client on the Asterisk GitHub repo that you can download and hook up to Asterisk. You can connect your friends to it, and everybody can see each other and wave at each other at the same time; it's really fun to see. I'm going to spend just a couple of minutes on some of the general improvements in Asterisk 15. Okay, I've got to get moving quicker, so this is going to have to be really quick. We had a number of platform improvements in Asterisk 15: improvements for GCC 7, build fixes for FreeBSD, and build fixes for the Hurd, which was pretty cool to see because I hadn't personally done anything with the Hurd in many years. There's Alembic support for MSSQL; Alembic is the database migration layer we use. And PJProject, for those of you who build Asterisk: it's now downloaded and built by default as part of the build process, so you don't have to go and find it yourself. We also have some new sounds for Asterisk. We added OAuth 2.0 support to chan_motif, the XMPP channel driver. We merged in some patches, contributed by a gentleman named Dennis Guse and another gentleman, I think in Berlin, who also worked on it, that add stereo support to the ConfBridge application. So if you have an endpoint that supports stereo audio, people can be placed positionally in different places in the conference and you can hear them from those different positions, which is kind of neat. There are some more debug utilities.
We added support for some more WebRTC technologies that were necessary for our SFU support, like rtcp-mux, VP9 codec passthrough, and some convenience options for setting up WebRTC. We also added support for a technology we'd been really holding out on in Asterisk called BUNDLE, which browsers use to multiplex audio and video streams over the same socket and the same RTP session. That was something we needed for a lot of the video work. Just as a quick heads-up for Asterisk 16, since you may be wondering what's coming: I apologize it's so brief, but I think my time got eaten up by the technology problems. Some of the goals for Asterisk 16 are to continue fleshing out Asterisk's SFU APIs, so from a REST API perspective you can control Asterisk better and manipulate, change, and mutate the conference experience you're having. We're planning on improving Asterisk's video resiliency in poor network environments; we discovered after we did the SFU support that there are some technologies that make the experience a lot better under high packet loss and bad Wi-Fi networks, so we're working on some of those. And we're going to continue improving PJSIP performance. I think that's pretty much it as far as my time goes; I don't want to run over. Is Alexander Traud here? Did he come this morning? Okay. There are a few contributors to Asterisk that I always try to give props and a shout-out to. Asterisk is a huge project, lots of people contribute to it, and there are a lot of statistics around that. I can't overstate how big it is and how many people are involved, but I always try to recognize them.
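The "convenience options" mentioned here include a single WebRTC switch on chan_pjsip endpoints. A rough sketch of a pjsip.conf endpoint follows; the section and transport names are illustrative, and a real setup also needs the matching transport, AOR, and DTLS certificate configuration.

```ini
; pjsip.conf (sketch; section and transport names are illustrative)
[webrtc_client]
type=endpoint
webrtc=yes              ; Asterisk 15 shortcut that turns on the pieces WebRTC
                        ; needs (DTLS-SRTP, ICE, rtcp-mux, BUNDLE, AVPF)
transport=transport-wss ; a WebSocket transport defined elsewhere in pjsip.conf
aors=webrtc_client
```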
Just as a reminder, if you're running Asterisk 11, it's time to get off of it. It went end of life last year, in October or November, which means it doesn't receive security fixes anymore. So if you're running it on a public network, you don't want to do that, right? Asterisk 15 was released at the same time, so move forward to either Asterisk 13, which is the current LTS, or 15, which is the most recently released version. Just as a heads-up, typically the odd-numbered Asterisk releases become LTS. Asterisk 15 is an exception: because of all the video work we did, we decided to wait and make Asterisk 16 the LTS release instead. So presumably this year in October, Asterisk 16 will be LTS. Anyway, thank you very much. Sorry to take so much of your time. If you have any questions, feel free to come up afterwards and ask me, because I don't want to take too much of the next speaker's time.