Hello everyone and welcome to Arch Linux Conference 2020. All around the world we are facing challenging times, and as if all this weren't enough, we are in the middle of a global pandemic as well. Today I would like to take the time and remind every one of us that we should focus on the good, not the bad. We shouldn't fight against something because we hate it; we should fight for something because we love it. For example, the pandemic forced us to re-evaluate Arch Linux Conference and ultimately led us to this online format, which is pretty awesome. I would like to express my appreciation and gratitude to all our speakers, to everyone behind the curtain who made this conference possible, to our team members and contributors, as well as our awesome community. And of course, thank you for joining us here today. I hope you enjoy our talks, the first of which is Arch Linux: the past, the present and the future.

Hi everybody, this is Judd. Happy conference day. I'm very happy to see that Arch has grown into something bigger than any one of us. You know, when I first started it many, many moons ago, it was always just to scratch my own itch, and it's one of the most joyous things that has ever happened to me that the community has grown above and beyond any of my best expectations. I'm sorry I couldn't be there with you guys today, but I'm very happy it's happening. So all my best wishes, and enjoy the conference.

Hi, this is Greg Kroah-Hartman, and welcome to ArchConf 2020. As a longtime Arch Linux user, I'm really happy to see this conference happen, even when we're all stuck at home. Thanks to all the Arch developers for maintaining such a great system that I rely on every day. Enjoy the conference.

So I have about 15 minutes to give you a rough history of the 18 years of Arch Linux. In the beginning there was CRUX, and it was good.
Highlights being its simple package build scripts and simple configuration utilities, but there was no dependency tracking. So the founder of Arch Linux, Judd, wrote pacman, and it essentially spawned a distribution. Beginning in 2002, pacman 1.1 was released, and then Arch Linux 0.1, codenamed Homer, was released not far after. A good quote from the release notes is: "The bad news is that you don't get a pretty interactive installer." So some things have not changed. The big selling points at the time were the i686 optimization, when most other distributions were using i386; the install-once, continuously-update, never-have-to-reinstall policy; and it being simple. Now, simple was defined in terms of the packaging and the tools for administering the distribution, not necessarily in terms of being simple to use. There are some good quotes from Judd around that time, such as "I've been told that Arch's documentation is less than perfect" — so things improved there — and "I would have to say that, yes, Arch is very suitable for servers." So the world was very optimistic back in 2003, probably pre-COVID. Our distribution releases used to have names, and some of these names are inventive and some are pretty boring. So, 0.2, Vega. It came with an interactive installer — the ASCII graphical installer, I'll call it — and a utility called pacsync. Now this has been lost to the ages, but I'm pretty sure it was supposed to automatically update your system without your intervention — so, things we don't recommend anymore. Mid to end of 2002, pacman 2.0 came out, which finally managed dependencies, along with the initial script for the Arch Build System, which would allow you to sync all the package build scripts and build your own packages, and Arch Linux 0.3, Firefly, was released. Months later, pacman 2.1 came out. This added support for multiple repositories, so packages got split into an unofficial repository and the main repository. Then there was the big move in October 2002 to GCC 3.2.
That was the move from GCC 2.95 and was rather a big move, and Arch Linux 0.4, named Dragon, was released soon after. With the release of pacman 2.3 in 2003, an unstable repository appeared. That was used mainly for beta releases and very, very experimental sorts of programs. A big change in the distribution was adding PAM support in mid-2003; at that stage PAM was rather experimental, so we were sticking to the bleeding edge of software usage. Arch Linux 0.5, Nova, was released — notably "a million package updates", according to the release notes — but we had PAM support, LVM support and GRUB support all added in that release, so we were starting to look a bit more like a Linux distribution we'd recognize. There were two ISOs released: one had 628 megabytes of packages, and one just had the base packages at 105 megabytes. So back in the day, one CD for the entire repos of Arch Linux. Here are all the other release names we had: Widget, Wombat, Noodle, Voodoo, Don't Panic and Overlord. In 2007 it was decided not to use version numbers anymore and stick with the date. That has sort of continued on to now, except at that time releases were going to be made as each major kernel was released, so you had the installer with the latest kernel on it. That didn't last for long, and eventually we moved to more monthly releases. So there has been a lot of flux in the repositories and their names over the years. In November 2003, trusted user repositories became a thing. My understanding was that each trusted user had their own repository, which they could put a bunch of packages in, and people could pick and choose what they wanted. In 2003 there was some renaming of the repositories: stable was called release, and later got renamed to current — I'm not quite sure when that happened; I couldn't find that information, but it just happened. The unofficial repository turned into what we now know as the extra repository, and in April 2004 the testing repository was added.
And the warning was that these packages "will be about as unstable as it gets" in the testing repository. Slightly less unstable these days, maybe. Early on there were a lot of big software changes. I think the early 2000s were when a lot of software was being developed and turning into what we currently know as Linux distributions. So XFree86 was replaced by Xorg, udev appeared, and glibc dropped support for very old Linux kernels — kernel 2.4 support went in 2006. You can see when the wiki, forums and AUR opened: in 2002 the forum opened. You can actually go back and read the original posts; they're still available in our forum, never been lost, and easily searchable actually. In 2004 the wiki was created. In 2005 we got the AUR. 2007 saw the start of the arch-dev-public mailing list. My understanding is that before that, a lot of the development was handled either on IRC or on a private mailing list, and so that has probably been lost to history. At the end of 2007 we also got mailing lists — these were renames, but arch-general and aur-general got their current names. 2007 also saw a leadership change: Judd was booted out and Aaron came in as the leader. He stayed on as leader until 2020. So this year we changed how the project leaders are elected and made it into a two-year rolling position instead. Other repository shuffles happened. 2007 was when our repos turned into a lot like they are now: current became core, and extra had already been renamed. We got the base and base-devel package groups created; the base group has since become a metapackage. 2007 was also the declaration that all core packages must go through testing, after a couple of "nice" incidents where systems stopped booting. In 2007 we also had a logo competition where the current logo that we have was created. So here are a few of the historic Arch Linux logos. We started off with this sort of zero-one ribbon style.
We went into the badge-type style, and currently we have the "Archer" design by Thayer Williams, which won our competition. You can actually go back and see the other entries for that competition on the Arch website too. So, a summary of some of the big packaging changes that have happened over the years. In 2008 man pages stopped being put in /usr/man and all moved to /usr/share/man, as dictated by the FHS. In 2008 docs were also added to packages, and that includes info pages. The policy until then was to only add man pages to packages, and everything else could be found on the internet. Info pages were really the driving force to get docs added back to the packages, and now we can treat info pages more like man pages when packaging. In 2010 we switched to xz compression. Also, /usr/bin/python got pointed to the Python 3 implementation, as it should be. Looking through the mailing list, this was actually less controversial among the developers than I remember — pretty broadly supported. 2012 saw the intro of systemd, so systemd tools replaced udev then. Carrying on through 2012, we got package signing enabled, and also the /lib directory became a symlink. At the end of 2012, systemd became the default and only supported init system in Arch. In 2013 we ended up moving the /bin, /sbin and /usr/sbin directories to become symlinks, simplifying the file system layout. The x86_64 port was unofficial for quite some time; it became official in 2006. It didn't get any multilib support until 2011 — it was actually officially supposed to be pure 64-bit until that stage. And in 2017 we said goodbye to i686. So when we started Arch, i686 being nicely optimized was one of the selling points; we really haven't moved on optimizing our binaries since 2006, when the 64-bit port was added. A bit about spin-offs. There were spin-offs for many architectures, so we've got i586, PPC, SPARC, MIPS and ARM. There were spin-off distros.
The first one really started — well, the first big spin-off — was the KDEmod repo. At the time, KDE was a big monolithic package where you got pretty much everything in one big package. The KDEmod team split it up into lots of sub-packages and actually pioneered what would now become package splitting. This turned into Chakra in 2010. Since then there have been many other distros and distrolets added. There have also been spin-offs such as Arch Stable, which was supposed to be for server usage and meant to be more of a point-release distribution; ArchBSD, a BSD port of Arch Linux; and a Hurd port of Arch Linux, which I think is still currently available and bootable. Other fun observations I had from going through all the archives of mailing lists: there were at least three newsletter variants and an ezine over the years. There were a bunch of attempts at automated security monitoring, automatic package building and distributed package building that all fell by the wayside, so it's good our security team is currently in a more established position. There were lots of debates over which CFLAGS and LDFLAGS we should use while packaging, both for security and optimization purposes. In 2008 you could read about the debate of choosing SVN over Git for managing our package scripts. Turns out SVN was chosen because we didn't need all the power of Git, and SVN did the job quite nicely at the time. Another thing you can find on the mailing list is lessons about not doing a Ctrl-C in the middle of an update on a remote server that hosts most of the Arch infrastructure. Luckily someone was available to fix that. And with that quick summary of Arch history, I'll hand over so we can hear a bit more about what's going to happen in the future.

Welcome to the second part of Arch Linux: past, present and future. Now we're going to talk a little bit about the future, but before we do that, a small disclaimer: basically, this is just a view into a possible future.
The future has not been written yet, as we know, so everything is possible, and not everything I present here has been finally decided. It's basically a mix of general consensus and personal ideas, so take the information like that. First let's look a little bit at the agenda. We're going to talk a little bit about our culture, because this is very important for our future, and we're also going to have a small insight into the concept of change. Then we'll get a little bit technical by discussing how we regain some of our lost excellence, and also talk a little bit about future improvements. First, a little bit about myself. I'm Levente Polyák, and I'm a full-stack software and security engineer, and I also love doing DevOps duties. I joined Arch in 2014 as part of the security team, and today I am also involved in packaging and development as well as in DevOps work. I was elected as the current Arch Linux project leader in 2020. So let's focus on one of the very important aspects of our distro, which is our culture itself. Basically, we should think a little bit about how we should approach things to succeed. This is a very fundamental part of our distribution, after all, because we are people and not just technology. What I mean by that is we should also look a little bit into our team spirit and how we can extend and expand it. In the end, working together is always more efficient and also more fault-tolerant. Sometimes there were ideas of having package teams or some core teams which are responsible for a certain aspect or a certain subset of packages. But, a little bit because of our nature, it went more in the direction of package maintainers maintaining software they're actually interested in, which is totally good. So I believe that's not something we need to change.
But what I want to talk about when it comes to our culture, related to that topic — and there has been some trend lately which I really like and want to see extended — is that we stop seeing packages as "my package" or thinking in terms like that. Because in the end we are all serving Arch, and being a package maintainer is just a duty. What I mean by that is we should extend co-maintaining packages, having a little bit more team effort, like having two or three maintainers for complex packages. This is a good idea because we reduce the bus factor by that, and I still believe it's very important that we have certain dedicated people attached to a package, because domain knowledge, especially for complex packages, is always very important, and you lose domain knowledge if you just share it across everyone. So I think we are doing fine; we should just improve the team spirit of having co-maintainers. This is actually, I think, very beneficial for our distro. Another part of our culture is communication, because as I mentioned earlier, we're all humans, and the human part of our distro — beyond the technology — is very important, because we the people are the ones who create the technology and who maintain the technology, and it is super important that we maintain non-violent communication and extend it in this area as well. It's super important that we are always friendly and excellent to each other. This should be normal, but sometimes it's not as easy as that; especially in tech it sometimes leads to heated discussions. I believe one reason for that is that we are dealing with a lot of text, which lacks side channels like emotions, voice and facial expressions. So if we take that into account, we should always remember that there are different ways to interpret text, and the most important factor here is that we assume the others aren't evil-minded, even when it feels like a heated discussion.
I think nobody wants to hurt someone else. It's not about that. It's that we have some needs, some topics that are important to us, and we want to get a point straight — so always try to think of it like that. On the other hand, always try to write text and follow a discussion in a way that is less likely to be misinterpreted. In the end we are all volunteers and we want to solve problems, and we have our opinions, which is a totally good thing, but we should keep it to technical facts and opinions. We shouldn't discourage contributors, as this has a bad influence; we should focus on the great parts here. Nobody is evil-minded, and we should always keep that in mind and not discourage free, volunteer work. The last part about our culture is improving openness. What I mean by that is making it a little bit easier, or a little bit more modern, for external contributors to join and help maintain and improve Arch. We will talk about that a little bit later, but one central part would be the interaction with patches and things like that. I believe we can improve it, and this is actually not very far in the future, so we will talk about that shortly. The second part now is talking a little bit about the concept of change. This goes a little bit hand in hand with culture, actually. We should have a different mindset about change in general. I feel like there is often a little bit of fear involved when change comes to mind. So always be open-minded about change. Change should be about re-analyzing something, and about concluding and deciding based on current facts and the current state. We should keep it at a rational level. It will also make us more flexible and more adaptable. We should also treat it as something positive. There is nothing negative in change. The world around us and technology in general are always changing and evolving — that's just their fundamental nature.
And this is actually good, because we also want to evolve. Change can basically mean two different things: either we are adapting to a new situation, or it's just part of evolution itself. So we should treat change really as something positive, not something negative. It also doesn't mean that if we change some aspects, it invalidates the past. This is absolutely not the case. There is nothing bad about change; as I mentioned just now, it's either adapting to something or generally evolving. Also, success in the past doesn't guarantee success in the future. This is also very important to take into account. And last of all, change also doesn't invalidate or disrespect the past. It can be totally great, the way we have been doing things that led us to today, but that doesn't mean that we shouldn't reconsider some aspects. This is why I urge that we always be open-minded about change, because this will bring us forward. So now let's get a little bit technical, because I believe this is one of the reasons a lot of people are watching this talk, or this part of the talk: they want to hear about actual changes. So let's get into that a little bit. The first thing I would like to talk about is regaining lost excellence. What I mean by that is: well, we have an 18-year ongoing journey, and some of our excellence — well, we kind of lost it. So let's talk about the first topic, which is optimized architectures. Back then we basically had optimized architectures, because at a very early stage we were already providing a set of architecturally optimized packages. We lost this excellence because we stopped evolving on that aspect and just stayed with what we introduced back then, for a very long period of time actually.
This is also because we have some technical challenges we need to tackle before we are able to have further optimized architectures, or potentially more architectures in general. But this is something we lost somewhere on our journey, and we should think about regaining this excellence. It could be either repositories with a subset of optimized packages, or in general more optimized architectures for all packages. This is not something we have decided yet, but I think we all agree that we want to have it. So at some point, once we have solved some of the challenges that are holding us back a little, we should really have an open discussion about how we regain this part. Actually, right now it's not too far in the future, because we have made a lot of changes, especially lately, that will help us regain this. Another aspect I want to talk about is modernizing package sources. We are still using SVN to maintain our sources, our PKGBUILDs, which are basically the descriptions of how to build packages. This is a bit cumbersome to deal with in some aspects. It is also not very open, if we think about the openness we talked about in the first slides. It makes it hard to contribute right now when it comes to packages. People try to create diffs and patch files and attach them to the bug tracker and things like that, but this is not really streamlined, and it's not really a great process for contributors overall, nor for us as team members to deal with all of this. I think this is pretty much the established consensus, so we are actually in the middle of a transition. We are working on having the migration ready, including everything related to it, as well as changing some of our infrastructure and tooling, which will allow us to get there. So Git sources are actually the very near future, and we will hopefully get them very soon. There are just some tiny details we need to figure out and solve.
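To make the contribution friction concrete: today a PKGBUILD change typically travels as a hand-crafted diff attached to the bug tracker. As a minimal sketch — the PKGBUILD content below is made up for illustration and is not a real package — this is the kind of unified diff a contributor produces, generated here with Python's difflib:

```python
import difflib

# A made-up, minimal PKGBUILD before and after a version bump
# (illustrative only -- not a real Arch package).
old = """\
pkgname=hello
pkgver=2.9
pkgrel=1
""".splitlines(keepends=True)

new = """\
pkgname=hello
pkgver=2.10
pkgrel=1
""".splitlines(keepends=True)

# This unified diff is the kind of patch a contributor attaches to the
# bug tracker today; with Git-hosted sources it would instead arrive
# as a merge request, with review and cross-referencing built in.
patch = "".join(difflib.unified_diff(old, new, "a/PKGBUILD", "b/PKGBUILD"))
print(patch)
```

With package sources hosted on a Git forge, the same change would be a merge request rather than a patch file someone has to notice, download and apply by hand.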
The third part I want to talk about here is signed repo databases. This is not actually something we lost, because we never had signed repo databases, but the world around us changed a little bit, and signed repo databases — the repository databases, I mean, that we pull software and packages from — are the de facto standard today across major distros. So we should really have that as well. I've been working on a proof of concept, with the help of others and of DevOps, looking into how we can have bare-bones, very secure, locked-up servers. We will have some discussions about that, because people had concerns, and we shouldn't just dismiss them but take them into account. The current proof of concept we are working with has so far taken the concerns into account, and I think there were some nice solutions with which we could tackle this topic. So we will raise some discussions about that shortly, and hopefully also gain signed repo databases in the near future. This also has a little bit to do with the Git migration, but we'll come to that when we are actually discussing this topic. So now let's talk about actual future improvements. One major thing I want to talk about — and this is also the first bullet point — is accelerating delayed package updates. What I mean by that is: timely package updates are our core value. We are a rolling release, and users expect that our packages are always up to date and that we are rolling fast. This really is our core value. Now some of you may ask why we need to accelerate, and what I mean by that. The only thing I mean is that right now we don't really have a central way of detecting upstream updates. So basically it's a per-package-maintainer effort to somehow keep track of upstream sources.
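The per-maintainer tracking just described is exactly what a central detector could assist with. As a toy sketch — everything here is hypothetical: the package names, the versions, the hard-coded "upstream" dict standing in for real release feeds or APIs, and the naive version comparison standing in for pacman's real vercmp semantics:

```python
# Toy sketch of an "out-of-date" crawler: compare the version we package
# against the latest version seen upstream. All data is hypothetical;
# a real implementation would fetch upstream release feeds/APIs and use
# pacman's vercmp semantics (epochs, letters, pkgrel) instead of this.
packaged = {"foo": "1.2.0", "bar": "3.4.1"}
upstream = {"foo": "1.3.0", "bar": "3.4.1"}

def parse(version: str) -> tuple[int, ...]:
    # Naive dotted-numeric parse; good enough for this illustration only.
    return tuple(int(part) for part in version.split("."))

def out_of_date(packaged: dict[str, str], upstream: dict[str, str]) -> list[str]:
    # Flag every package whose upstream version is newer than what we ship.
    return [name for name, ver in packaged.items()
            if name in upstream and parse(upstream[name]) > parse(ver)]

print(out_of_date(packaged, upstream))  # → ['foo']
```

A checker like this could automatically flag packages on Archweb instead of waiting for a user to do it; for now, per-maintainer tracking is what we actually rely on.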
In some areas it works great, and in some areas it actually doesn't work out that well, and sometimes it takes weeks or months and multiple releases until a user flags a package out of date on Archweb and we finally roll an update, because the package maintainer was maybe not really aware of it. I don't think this is a people problem, so we should not start yelling at people for not properly keeping track; I think this is more of a tooling problem. We should solve it with technology. So I've also lately been playing around and toying with something we ourselves call a crawler. This is something I will raise in the future: having something technologically integrated into Archweb which is able to automatically flag packages as out of date, and this will, in the end, accelerate delayed package updates — one of our core responsibilities. Second — and this is a new topic I want to talk about — is integrating reproducible builds. We have really achieved great things in the topic of reproducible builds; there will also be a talk by Morten, The State of Reproducible Builds, if you're more interested in that topic. It is basically a part of supply-chain security: to phrase it in maybe one single sentence, we validate that the released packages aren't and can't be backdoored, because you always build the very same thing from the sources we are committing and maintaining. So there could be no sneaky, invisible ninja patch which implements a backdoor or something like that — it would be detected. And by the integration part I mean better integration into Archweb in terms of icons, in terms of the API, and finally also user-facing tools: users should be able to query the state of their system with regard to reproducible packages. And maybe in a very far future we could also dream about reproducible packages being the standard, and people should somehow be able to say: I only want reproducible
packages, not install non-reproducible packages. Of course this is very far in the future, and we need to solve a lot of problems that we are facing if we want to achieve that, but it is still a dream we should think about for the future. The third topic is implementing single sign-on. This is actually the very near future, because we are already working very hard on it and we are also testing it — for team members it is already present. We need to integrate it into all our different services a little bit better. It also helps a little bit in terms of community openness, and we will also have external login providers, which helps to eliminate some annoying factors. It also simplifies, from our team's perspective, the on- and off-boarding, is less error-prone, and is just something great. So we are very near to having that. The final topic I want to talk about is consolidating detached areas. We currently already have GitLab, and we are moving some or most of our projects to GitLab. We need to solve some problems first, and also open up the single sign-on platform to allow external contributors to use GitLab as well. But then — and this is something I personally would like to see — we should think about consolidating all our fragmented tools: basically our Kanban boards, our bug tracker, and things like that. For some parts, like the Kanban board, we already have a consensus on a single aspect, so there will be some discussions, but I personally would really love to see it all integrated into one platform, because then we gain a lot by having it in one central place, especially when it comes to packages, package sources, PKGBUILDs and the bugs related to them. It is then very easy to deal with everything and have cross-references in terms of issues, sources, merge requests, and also Kanban boards for some epics and things like that. I believe GitLab is really a nice tool there that will help us to streamline some of our
fragmented areas. So we should have a discussion and see how we can extend the platform, or how we want to use this platform in the future — but more about that in Sven's talk, Architecture at Arch, right after this one, so join in and listen to him. As final words, I want to say thank you all. Thanks to the community: you're actually what makes Arch — Arch wouldn't be where it is today without you. So we are very happy about our community, and we are also inviting you all to take a look into how you can get involved, or how you personally want to get involved, if you feel like you want to do that. There is a wiki link you can follow; it gives a little bit of explanation about the different aspects and the different parts where you could get involved and start helping Arch to stay what it is and to evolve even further. So a big thanks to our community as well. I hope you enjoyed the talk, and now we can go over to a little bit of Q&A.

Welcome to the first live segment of the Arch conference. I'm here with Levente, the project leader, and also Allan McRae, who had to teach me how to pronounce the name. So I think we'll just kick off with questions. The first question that arrived was: what are info pages? — from Verst, if anyone wants to explain. So info pages are like man pages, but a bit more verbose, again for access mainly via the command line. A lot of GNU projects like GCC have big info pages and not very big man pages, and that was the drive to get them back into our packages. Interesting. So another question, from March the 12th: will there be another split in repos? Levente: Probably. I'm not totally sure what "split in repos" means here; I suspect maybe it's about the optimized architectures. It basically depends on the decision of which way we want to support optimized architectures, so this is not something we could answer right now, but besides having optimized architectures I don't really think it would make
sense to split it even further. So, yeah: probably. For Allan: where does the name Arch Linux come from? — from secret. Oh, so Arch is from "arch enemy". I'm not sure why it was chosen, but that's what Judd came up with. I recall reading about "arch" as in "primary" also being part of the reasoning. From post-factum: what I don't understand is the SVN choice — why the SVN choice? So back then, Git was basically not that widespread compared to today, because today it's, I guess, the most used version control system, very well known in the developer and technical community. So I guess the decision basically was that Git was not that well known back then, not that widespread, and SVN was just simpler to use. That's why it was decided like that in the past, which is totally fine, because it made sense back then — but as I also explained today, times have changed a little bit, and that's why we are now in the transition to Git. There have been a few questions about Arch Linux ARM. March the 12th asked: will there be an official ARM port if ARM gets more popular? Another question is: how about bringing Arch Linux ARM into the Arch Linux fold? A third question is: why is ALARM not more tightly integrated into Arch? — which all ties together. OK, so I think the ARM architecture is pretty popular. Some of the reasons why: there was having to have kernels for specific machines — it wasn't as easy as i686, where you provided a single kernel; there were a lot of different variants to support. Also, the ALARM people do an awesome job: they've got a good infrastructure, they've set it up quite nicely, so they're doing their thing very well, and there's not much point in us replicating it. So the question would be more: will they merge with us at some stage? The topic's been broached a few times. I think us not using Git for our package management is a big point, because they use it for theirs. So there's potential, but there are a lot of factors in the background: sponsorship of machines
and everything that they get with their branding, and how that would work if they merged in with us. So there are a lot of issues to be sorted, and they're doing very well without us, so we can let them continue. So, Felix was actually asking what we're thinking about with RISC-V. I know Felix has been working on it, but what are your thoughts? Well, I have no thoughts on RISC-V; I haven't considered it. I think it will basically boil down to how much interest appears inside our community, or from new volunteers, related to RISC-V, because in the end it is not enough to have just a small, tiny portion of people interested in maintaining it. Of course, maintaining a new architecture will also give us a lot of work in terms of support and other challenges related to different architectures. So I guess there is no clear answer yet, because there has been no discussion about it, but I invite everyone to start a discussion, and we will see where we go. Yeah, I'll also point out the 64-bit architecture started out as a community-driven project, and then when it got popular it got moved into Arch. So there is a precedent there, because there have been many, many different architecture ports of Arch that have fallen by the wayside over the years. So when it gets big enough, maybe. We also have a question about diversity in Arch, probably for Levente: how is the diversity of the people in charge of Arch? I would say this could be improved, but basically it depends on people who are interested in doing the work. If you just look at the current staff, right now it is not very diverse. I'm personally not really sure what the reason for that is, but we would love to see it being a bit more colorful. Nice. So I'll just skip a few questions from the audience because we don't have that much time left. We can take one of the CPU questions which has been going on: what's the current state of the discussion regarding dropping support for old CPUs? What would the cutoff be — at what point are CPUs considered too
old? And at the same time, "I have a Core 2 Duo box that's perfectly fine" — asked by Malaku. So, CPU optimizations and stuff: I suppose this came from me trying to push some more optimizations into packages, maybe a year ago now. I think the discussion is more focused around providing additional architectures rather than dropping support. But I will note, back in the i686 days, we had the VIA C3s that had almost-i686 support and often failed when an optimization landed in a package, and we just said tough. So I think it's a decision that will be discussed as we work towards more optimized architectures — whether we continue supporting plain x86_64. I would assume we would. Yeah, cool. So for Levente here: is Arch Linux lacking in volunteer power? This was asked by Lirst. So this is a personal opinion I've had for quite a long time: I think we really could extend the area of having people use testing and give more feedback on our testing repositories. I think we are very much lacking volunteer power there, to do it better than we are doing today. It works for some packages, more like core packages, which are not that much of a problem, but we also want to be able to use testing for community packages and not-so-critical parts of the system, which in my opinion are not very well tested. So this could be something where we really search for new volunteers to help us use testing and give real feedback on it. But in general, basically every area depends on people who want to invest time. We happily invite people to join and help us; of course, I wouldn't say we have any area where we need absolutely no volunteer power at all. Just look for what you like and help us — but if you want one area, then help us with testing. There's time for one more quick question: how does Arch plan on continually succeeding as a major distribution going forward? — either Allan or Levente. I guess this is a tough question to answer, because — I mean, this is a little bit what I
talked about in the future part: we should just look over and over again at our excellence and see if we have still held it up until today, and if not, how we adapt to the new situation, or where we see areas where we want to just evolve a little bit further. So I couldn't pinpoint one thing on how to do it, but we should just be open and go with the times. This goes a little bit hand in hand with regaining some of the stuff that we lost, and also looking into more future-proof platforms for contribution — but this is also something Sven will be talking about a little bit in the next talk. So, yeah, I would say: just be open, and look at the current state more often. And that's all we managed to get in on the Q&A session. Super many thanks to Allan for answering questions, and Levente, and then we'll go over to the next talk soon. Bye, bye, bye.