OK, hello. So today I would like to talk about desktop Linux platform issues. I don't claim that I have the solutions to all the issues I'm pointing out, but sometimes pointing out these issues can be the first step towards a solution. And this is why I think it's very important to get a conversation started around the kinds of issues that still plague desktop Linux in 2018.

All right, so who am I? I am just a random guy who likes stuff that just works. So I'm a Mac user, and I tremendously like Linux live systems, where you just launch something and it works without a lot of fiddling around. I'm also a distro hopper, and I have been working on this topic of application distribution for well over a decade, first with what was called the klik project, which then evolved into AppImage. But this is not what I want to talk about today. I would like to talk about the underlying issues that I discovered while deploying applications for the Linux platform.

Desktop Linux platform issues. Linux on the desktop is really a very niche operating system, and we need to keep this in mind all the time. We have less than 3% market share for all of Linux together on the desktop; it is still really dominated by Windows and macOS. But it gets worse. These under 3% splinter across a tremendous number of different distributions, hundreds if not thousands, which happen to not agree on many things. And it doesn't stop there. There are also at least tens of different desktops, and some of those desktops today are very opinionated about what applications for those desktops should look like. Now, if you do a simple multiplication: take these 3% operating system market share, multiply this with the distribution of your choice, multiply this with the desktop of your choice, and your market share is gone. This does not scale. This resulted in even Linus Torvalds not wanting to provide binaries of his own diving application, Subsurface.
He provided binaries for Windows and for the Mac, but not for Linux, for a very long time. And this is obviously a very sad situation, one where we can't blame companies like Adobe for not wanting to offer their applications for the Linux platform. Now you could argue: do we need that? There are arguments for and against, but I think it would be great to have the choice to run even proprietary applications on Linux.

So, an observation: real-world apps are not limited to one desktop anymore. In the earlier days of Linux on the desktop, there was this idea that you have, let's say, GNOME, and it comes with its own applications; you have KDE, and it comes with a totally different set of applications. This is changing. Look at one example, Krita: it started out as a painting application for the KDE desktop, but today it has many, many more non-KDE users than KDE desktop users. And I see this pattern all around. Real-world apps are really cross-platform. They run on Windows, they run on the Mac, and they should run on Linux, all of Linux. Most of these are Qt-based; some are GTK-based, too, for reasons we will get into; and increasingly we see more and more Electron apps. Now, you may think of this as a good or a bad thing, but it's a fact: real-world apps are cross-platform.

Now, what does an app developer need? A developer wants to reach a large user base, which needs to be supportable, and wants as little work as possible to distribute applications. Really, an application developer is looking for a dependable, stable platform. One such platform is Windows. For example, if you developed something for Windows XP over a decade ago, it ran and ran and ran, and it still runs on subsequent versions of the operating system. Very similar for the Mac: you get one thing you can depend on, and it works. Now for Linux, it's a different story. Here we come to the core concept of what I consider a platform to be.
A platform is something that you run your stuff on top of. Very important concept. Windows is a platform: you run stuff on top of Windows. The Mac is a platform: you run stuff on top of macOS. The typical Linux distribution, I would say, is not a platform, because it's not made to run your stuff on top of; it's made to get your application into. That's a different thing. Really, platforms are something that allow you to run your stuff on top, not something you need to get your stuff into. And by the way, all successful operating systems, both on the desktop and on mobile, are platforms in this sense. But Linux distributions are not. They're not optimized for running third-party applications outside of what they ship themselves. Most distributions want to package the world for themselves, although this might be changing slightly now. And first of all, most distributions make no guarantees in terms of what the operating system provides that you, as a third-party developer, can rely on.

An example of this mindset is taken from the Ubuntu wiki, where it says: "Normally and preferably, software should be installed using the Ubuntu Software Center." They actively tell people: we are a walled garden, you use our stuff. This is not a platform where you run your stuff on top.

So distributions really don't seem to care too much about this topic, but users care. I, as a user, want to download the very latest application on the day it comes out. App developers care: as an app developer, I want to write my application and also reach these 3% of the market that are using Linux in its different variations. And finally, library developers should care, but sometimes they are a bit powerless; I have an example of that.

So let's really think about stable platforms, again Windows and the Mac, versus moving targets. And if I look at the Linux desktop today, it looks pretty much like a moving target to me. Linus also noticed this. He says: we have one rule in the kernel.
We don't break user space. But then come those user space guys, and they break user space all the time. I think this needs to stop.

So there have been different approaches to solve this moving-target problem, but so far most solutions try to work around it in clever ways rather than address the core problem, which is that it is a moving target. If you look at the typical stack we have today, there is a kernel, which, by the way, has a very well-defined and stable interface to user land. But on top of this comes the moving target. And now people have realized: if we want to enable apps that run on this, we need to build something in between, where we give users something stable, basically by ignoring the moving target and putting another layer on top of it. Flatpak calls these runtimes. Snappy also calls them runtimes. And there is not just one runtime; there is not even just one runtime per year or every two years. There are actually multiple runtimes at any given time. This necessarily leads to a lot of duplication, and by duplication I don't just mean bits and bytes; I also mean supportability. You need to take care of all this stuff; you need to apply security updates to all of it. So, really a lot of duplication. And you're essentially throwing away the operating system, which was meant to run apps in the first place, right? You're getting rid of the infrastructure your operating system provides and replacing it, or additionally putting your own mini operating system on top.

Instead, I think we should think about providing what I call a guaranteed common Linux desktop platform that makes certain promises, so that application developers can use what comes with the system and be sure that it will still run in two years, still run in three years, and that everything is either part of this stable operating system or can be shipped along with the application. So "platform" is really a way of thinking.
If you want to do a platform, you need to define what the users of your platform can take for granted. And it should be stable, not a moving target. Platforms also think in terms of backward compatibility, which means that if you're developing an application, you shouldn't target the latest bleeding-edge library that was just released yesterday or will be released next year. No, you should check which operating systems your users are using today, which means something like Ubuntu 14.04 or CentOS 7, and develop for that, because that is what your users have today. The same holds in the Windows world: a couple of years ago people still developed for Windows XP; today they probably target Windows 8, because that's what users still have out there. I call this the principle of the least common denominator.

I think we need to fix the Linux desktop platform, not work around it. And it's really messy as of today, often for no apparent good reason. Take library versioning, for example. Even within the same family of operating systems, you can have a library, libarchive in this example, where libarchive has been at version three for quite some time, and still the same library is named differently in different versions of the same operating system. It's really complicated. Someone should standardize this. The question is: who? The developers of the library, by the way, say: we can't do anything about this; our version is three and has been for a very long time, but distributions just change the naming; we can't do anything. So someone needs to do something about this. There was a long and heated debate on GitHub, you can look it up, where everyone claims they cannot do anything. So even today, I'm not sure who could do something about this, but someone needs to.

Or let's take library paths. Different distributions just put stuff into different places. There used to be something called the Filesystem Hierarchy Standard, which I originally thought would solve this.
But no, despite this standard, stuff is all over the place.

And then, basic infrastructure. What is really the desktop Linux stack? Is there even such a thing as the desktop Linux stack? I think yes, there is, because if you actually compare desktop distributions, even from different families of distributions, you will run into the same set of libraries and core services over and over again. So those distributions are not as different as they look on the outside; most of the stuff that keeps them running is very, very similar. Yes, I'm talking about you: glibc, libstdc++, OpenSSL. These components are in basically every distribution, but someone should standardize how they are shipped so that I can rely on them being there in a certain way. Then there is the whole Xorg and Wayland story, graphics drivers, audio; all of this is really basic infrastructure. Then there are the GUI toolkits, including font rendering. I mean, every desktop has to render fonts, so why can't we standardize how this works and present it in a way that can be relied upon? Someone should really standardize this.

So, a lot of technical issues. There is a long backup section to this talk with over 30 slides, which I won't show today, but you can look them up online, with all those issues in detail. It doesn't stop here, though. We also need to fix the usability. This is what happens if you take an application, an ELF executable, today and just double-click it. It says: "There is no application installed for executable files. Do you want to search for an application that can open this type of file?" Also, there is no icon, just a generic one. Very inconvenient. Contrast this with how applications work on Windows or on the Mac, where an application has a logo, and you don't have to install a Mac application to see the logo or the icon of the app. It's just there, right? Where you can just double-click.
It just works, without having to place text files in crazy directories and such. Since systemd is a big topic here at this conference, I was thinking: couldn't systemd do something for desktop apps as well? Wouldn't systemd be a logical place to put a replacement for this copying around of desktop files and MIME types and icon files, which really is messy, and centrally manage this desktop integration by keeping all the metadata in a database, kind of like what macOS does with what's called Launch Services? By keeping a database, it would also be a bit more flexible than what we have today. For example, we could handle multiple versions of applications much more gracefully. I think we all know examples where one complex spreadsheet works perfectly in one particular version of the spreadsheet application, but another very complex spreadsheet needs a slightly different version. A system like this could handle those cases very elegantly.

And then we could also solve the complexity of moving things around in the system: relocatability. Even today, if you have an application that you can relocate, for example LibreOffice, you can put it on a network share or on a USB drive and the application works, but the desktop integration breaks as soon as you move something. This is very different from the Mac, where you can put an application on a USB stick, pop it in, double-click it, and it's integrated with the system automatically, because the metadata always travels with the application and doesn't have to be installed.

What do distributions do? Well, Red Hat Enterprise Linux actually has some ABI guarantees, the Red Hat Enterprise Linux platform guarantees. That sounds very promising: there is a set of core libraries that preserve compatibility across three consecutive major releases. A great step in the right direction, but it's just one distribution. And if you look at what they think the Linux desktop platform consists of, it's actually not so much.
So this is the entire Linux desktop platform as defined by Red Hat at the moment. You see things like GTK2 in there. Is that really all that makes up the desktop Linux platform? No GTK3? Is it really up to one single distribution to decide what makes up the Linux desktop platform? Someone should standardize this, and I'm asking: who?

Now what? I personally think we need a new standards body that governs the desktop Linux platform. Someone should do this. Again: who? The Linux Foundation? They seem to be more interested in servers these days. Then, what happened to XDG? Is it still alive? There are mainly desktop environment people involved, no app developers, no app users. The LSB, the Linux Standard Base? Is it still alive? I don't know. It's dead, officially dead. So the question of who governs the Linux desktop platform is really pressing, because, as with what Linus said on that slide, if there is no one who governs the Linux desktop, it's no wonder that it's a mess.

So with this, I would like to open the discussion. As I said, if you go to this URL, there is a large backup with detailed explanations of the issues that I could only briefly mention in this talk. But I really want to bring across the point that if we want Linux on the desktop to be successful, we need to adopt platform thinking, because there are already two large, dominant platforms on the desktop, and by splintering the remaining 3% as we do today, this is clearly not a viable desktop platform. So with that: questions?

I'm sorry to say that your approach is incredibly naive. Usually there are trade-offs; there are very good reasons why distributions do things in certain ways. The attempt to standardize on a distribution-wide baseline was made by the LSB. It failed miserably; nobody cared. And it was much simpler than what you are trying to do, because it did not focus on the desktop but on a much smaller subset of applications.
If you want to target a five-year-old system with your applications, you can do that now. You will be able to install the application on all distributions; on new distributions the old library packages are there, and it will usually just work. And generally, distributions provide versions of applications for their own users. This tends to work. As a Debian user, as a Debian developer, I do not feel much pressure to install applications from outside of Debian, because just about everything is there. I'm sorry if your distribution of choice has far fewer applications. Thank you.

The distributions I mentioned were, of course, just examples. I think this is more or less true in every distribution. For example, I'm a 3D-printing fan. A new version of the slicer that I use comes out, announced today. I want to use it now, not in five months when it lands in the distribution. Now. Also, I like to run a stable operating system that doesn't change every day, where the system itself is known to be there for years and years. I'm maintaining a system for a random friend that's 10 years old, because he wants stuff stable. Or take companies, for example: they certainly want a stable operating system. I don't think distributions have solved this so far, no matter which distribution. Also, I think the traditional distribution model doesn't really scale once you get into applications that are very long-tail, niche applications. For example, an application that's made specifically for one school: they don't want to distribute their application to the whole world, they just want to give it to their school community. Distributions are great, in my opinion, for managing the base system, the core operating system. But they're not ideal for all those leaf nodes out there.
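The "target what users actually have" approach discussed above can be checked mechanically: a binary compiled on a new distribution references glibc symbol versions that older systems lack, so the highest version it references gives a rough lower bound on the systems it will run on. A minimal sketch, assuming a glibc-based system and using `/bin/sh` as a stand-in for your own application binary:

```shell
# Rough check: the newest glibc symbol version a dynamically linked
# binary references determines the oldest distribution it can run on.
# /bin/sh here stands in for your own application binary.
grep -ao 'GLIBC_[0-9.]*' /bin/sh | sort -Vu | tail -n 1
```

Building on the oldest distribution you still support, or in a container image of it, keeps this value low; `objdump -T` on the binary gives the same information in a more structured way.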
Yeah, I think you already pointed out the fundamental asymmetry here, which is the root of the problem, because an app developer's stance is always: I want to push out my new version the minute it comes out, but nobody else around me may change anything. And if everyone has that attitude, this fundamentally cannot work. Take a 3D printer: five years ago, in an old distribution, you wouldn't even have had a driver for it. So, you see, in essence the only thing you can do is to not rely on any platform and ship the parts of your desktop stack yourself, which is essentially what happens now with Snap and Flatpak. And I guess that's a good intermediate way. We should not try to repeat the failure of the LSB and try to standardize on GTK2, where in five years nobody is interested in maintaining it anymore.

I think the real question is: where do you make this stack split? Right now it's at a very low level, at the kernel. The kernel interface is stable, and everything on top of the kernel is basically a changing system. So the question is: is there another set of infrastructure that is generic enough that it would make sense to also specify that higher-level abstraction, and essentially freeze it, or do versioned releases every two years or so? That's the fundamental question.

Hi, I just have, like, two remarks. I do agree with you to some extent that it would be much easier, but the first thing that bothers me is market share, which is very emphasized, and market share basically means how much revenue you get from a platform and... Or users. No, not really; market share is by definition, like, how much money you get. And, of course, if you sell a platform, your market share is going to be much bigger than if you give it away for free. That was just one remark. And I was a Windows developer for a very long time, and I can tell you that Microsoft breaks stuff all the fucking time.
So, yeah, good luck with running some ancient stuff. They solved it by basically providing an emulator for old Windows versions, so you can run it basically in a virtual machine, to explain it in a very non-technical way, and this is kind of why it works. But we could theoretically do this on Linux as well, I mean, if you really want to. So, those were just two remarks. Thank you.

So, yeah, sorry, maybe I didn't pick the correct term. When I talk about market share, I really mean the number of users. Take this manufacturer of 3D printers: they make nice printing machines, and then the question is, for which operating systems do we provide our software? And, of course, such a company looks at it like: well, there are all my Windows users, there are all my Mac users, it's clear that I need to support them; but then there are these 3-ish percent random people. And I think that's the core problem. If we provide them with a clear target that they can produce for, it's more likely that they will come up with Linux software than if we present them with an ever-changing mess. That's our problem. That's it. Go ahead.

First of all, I haven't privately had any Windows box for many years, and I believe that first we have to define who our target is, and we have to find the business case, because Windows actually drives everybody. Even on the Mac: five years ago I had my last Mac, and I bought Office, and it was actually stated on the Microsoft page that it was not compatible with other Office versions. And that was the problem, because if customers send me a format, they expect me to return it in the same format. So LibreOffice for business use: no way. It can be nice for my private tasks, for my children, to print something at home for school, but not for business. But my child probably won't buy Linux, or maybe that is the platform we want to sell to. We first have to find the business case, and then we will find a solution.
I think discussing what an office program can or cannot do is a bit outside the scope of this talk, but if you look into many domains, newer domains like 3D printing, the picture looks a bit different. There you have standardized open formats, where you can use the same files on all platforms. So I think we are slowly moving towards being able to actually enable these cross-platform use cases. And a lot of the applications I showed in the beginning are cross-platform, and you can open files with them on all operating systems.

I really like the idea you are presenting here, but I also understand why people are a bit reserved, because part of what Linux is, is building your own system; that is a fundamental part of it. But I have a question for you: what do you actually think about the Android project?

Yeah, so Android, and probably the same goes for Chromium OS: they have an interface that defines what an application developer can expect, and then, at least for some time, they leave that unchanged and only add new features in new versions of Android, but don't take the old ones away without a deprecation notice that is long enough for an application developer to make changes. I think that is a very good model. Of course, it is also a bit easier to look at Android and say what it is and what its APIs are; that is not so easy for the desktop Linux world. But still, I think we should think exactly in this direction. By the way, if you look at what an Android application is and why it is so easy to install and uninstall: an APK is essentially just a zip file that contains everything that is not part of the operating system platform.
So a file can only be in two places: either in the Android operating system or in the zip file that is the app. And this simplifies things incredibly. By the way, this is what I am doing with the AppImage project as well.

Hi, I would like to add another angle to this discussion. I work at a company that develops for Linux; basically all the stuff we are working on ends up running on Linux. We dropped Office and moved to G Suite, so everything works in SaaS applications as far as business usage is concerned. So it would be reasonable to adopt Linux as the desktop for the developers. Yet we have to fight with our internal IT, because in addition to all the applications for day-to-day end-user use, there is all the stuff added for the corporate use case, like BigFix, Carbon Black agents, Deep Instinct, Symantec Endpoint Protection, which is installed on Windows, and Windows is enforced on the end users, in this case developers, by corporate IT. So that is an additional pain point for Linux on the desktop and for its spread, because if we could also address that, it would vastly increase the market share of the Linux desktop.

You're absolutely right. I know a lot of companies that work exactly as you describe. What we see is: if you go to a random company and you see guys with two notebooks on the desk, one corporate-provided and one nice one, then you usually know that these are the developers, because you can't work on those locked-down machines. So bring-your-own-device is something many companies already do, and I have seen it in a lot of places where the work PC is this locked-down desktop Windows machine and the actual work gets done in another way.

So, thank you, Simon. We are out of time, but I guess there will be plenty of time to discuss this over lunch, outside of this session. Thank you.
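The "a file lives either in the OS or in the app bundle" idea discussed above can be sketched with a plain directory that carries both the executable and its desktop metadata, roughly the AppDir layout with an AppRun entry point that AppImage then packs into a single file. The bundle name, entry-point contents, and .desktop fields below are made up for illustration:

```shell
# Illustrative sketch: everything that is not part of the OS travels
# with the app in one bundle directory (AppImage packs a similar
# AppDir layout into a single file).
set -e
APP=$(mktemp -d)/MyApp.AppDir          # hypothetical application bundle
mkdir -p "$APP"

# the application itself (a stand-in script here)
cat > "$APP/AppRun" <<'EOF'
#!/bin/sh
echo "hello from a self-contained app"
EOF
chmod +x "$APP/AppRun"

# desktop-integration metadata travels with the bundle instead of
# being copied into system directories at install time
cat > "$APP/myapp.desktop" <<'EOF'
[Desktop Entry]
Name=MyApp
Exec=AppRun
Icon=myapp
Type=Application
EOF

"$APP/AppRun"                          # prints: hello from a self-contained app
```

Moving or deleting the directory moves or deletes the whole application, metadata included, which is exactly the relocatability property the Mac and Android examples in the talk rely on.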