Oh, that's bad business. That is bad business. Hey, good morning. Oh, we got it, though. We got it all worked out. Oh, yeah, it's here. The guy that was here helping us, we just swapped it out. He got here and, like, oh no. I brought my Raspberry Pi and something like that. But I don't know what that is. OK. All right, thanks. Everything working for you? Yeah, as far as I can tell. How does it sound? I'll just sit in the back and see how it comes up. OK, this is what we're doing right now. Test, test, test, test. OK, all right. OK, OK, test, test. Sounds OK? All right, hey, thanks.

Hey, Ken, what you up to, man? OK, now. Oh, OK. Oh, you're going to heckle me a little bit? Hey, man, I'm just doing, you know, Richard. Yeah, it's fine. It's fine. Yeah, Mark Shuttleworth couldn't show up, so they got me. Have you gone a little bit high? Yeah, well, I don't know about that. Isn't it, man? How you doing? You know, all the projects — it's been nuts the past five months. You know, I have a part-time business. I do photography. And there's three actors that I'm working with. Two of them — one just made a film short that made it to HBO. Got a guy named Anthony I'm working with. And then a lady by the name of Kennedy Hall. She has a series that she's going to start shooting this summer. So I've been doing headshot portfolio work for them. And, you know, a lot of times you just meet people and you do favors for people and whatnot. And these young people — I had no idea it had so much to do with them. And it's like, at least two or three times a month, I'm just shooting headshots and stuff. And it's not about the money, it's about the exposure for them right now. But I had no idea that after you shoot, that's where all the headache begins. You got to do color grading. You got to do retouching. It's unbelievable what you have to do.
It's going to top that. You know, there's still my regular day job. I'm kind of at that point. I'm like, you know... I've been working with a photographer that's actually been doing all these accounts for a reason. I was a fan of his. His name is... He's going to have a gallery. He has his own studio. He's a photographer. He has a computer. He has a door. He has a fireman. He's the only way to... What do you plan to do? You know, creating an image too. And a lot of people, you know, it's like creating a human figure. Oh, okay. Now that you have all this... Is it really artsy kind of stuff? Okay. But he has to do the other anyway. He has all this stuff in Hollywood. You've done Hollywood. Oh. Is he a young guy over here? Oh, young guy. Young guy. The mixer is pretty good. He's just a real... You working out? What's going on? Getting ready for the break. Oh. I don't see it. I mean, what's it all worth anyway? You know... Man, I'm trying to, like, extend that. You know what I mean? Trying to extend this. Man, you got work to do. Get ready for the break. It's crazy, man. You know what I mean? Yeah. What are your requirements? You're going to be able to fill in some drafts. One and a half to four. Okay. And then work on the back end. And then do the interoperability. Because it might take you to a region around the world. Standing up data models. And screens. So there's a culmination of a lot of stuff going on. Right? I thought the OpenAPS project was going in that enterprise. I was going to ask you. You got you doing the... You know, the computer and stuff. Oh, you got away from that? Well, I did one with the library at the same place. Call me. Hello? Who? Hello? Oh, you've got the wrong number, sorry. Could no one contact you? Double people. Oh, okay. Like I said, that ate up a lot of time. I have to admit, it ate up an important amount of time. So when Richard asked me to do this, I was like, okay, that's only 30 minutes. I'm good.
And I can just geek out and pull on this monkey suit. And I can just do what I love to do. The only thing I do for myself. After that? I'll tell you. I got a job. It's been crazy. You're at the school? Yeah. Well, the problem with what happened is the district said, okay, these are strategic plans, we're outlining our technology plans, we will all be using innovation. And they haven't put anyone behind anything. Well, I'm going to talk a little bit about funding models and stuff that makes the point more patent. But the one thing that got me was that they said, okay, guys, we're going to buy hardware in 2017. But they had all these technology initiatives, like, this time we want to see this, we want to see this. We want to see the technology plan in this light. We want to make sure that the kids at the sixth-grade level are exposed to word processing in their curriculum. Stuff like that. So they would give us little bits of funding. And so we would buy, you know, quote, the carts that they would use. But who was going to train the teachers? So teachers would be left with this stuff. And I'm like, okay. So we're going to pay money for the hardware and the proprietary software, but there's no training and support, because the teachers don't feel comfortable using this stuff. So the planning part is like almost nothing. They're just responding to funding and to pressures in terms of their requirements. Exactly. And I think it's a joke, because then they'll say, well, this is too much. So now we're not going to go and apply for these block grants, because some of the funding that is provided at the state and federal level is really reimbursement-based, which means you have to fund a project and you get the money on the back end. Because the data and the local networks, to me, are more important than the cloud. We have the same thing. So what's that bill? What's that bill? Is it consumption?
So how do you... doing a red sheet from Google is not the other side, that they have people register with a long line. I'm good. How are you doing? All right, all right. We'll just wait for some of the people coming from registration. But it's just what you said: I think the local control of your technology gives you more options. The problem that we have is we've wanted to move toward the Chromebooks for a couple of years. It's easy for the IT department to maintain them. But do you see what we're looking for? We're looking to do the requirements. Yeah? You're right, but the problem is you've got a bad habit of telling the truth. Right. You know, he's got this in. Well, I'll get off my soapbox. This year is the year when I finally integrate into the private sector, because I have a stronger understanding and I'm confident in my non-profit, but with the for-profit I see a place — I know how to dance with that now. Because what you do is you do the non-profit, and when they say no, the for-profit work that's been funded... and the ROI is you have a return on investment or something like that. Right. And then the government and the non-profit isn't greedy, because it happened and we weren't impeded by all the bureaucratic time, weight, the energy that goes in. It's funny that you mention that. The funding that supposedly was going to give a boost to technology in 2009, when President Obama did the stimulus package, right? We had what was called ARRA, which was the American Recovery and Reinvestment Act, or something to that effect. These federal and state funds — they would say, oh, we're going to provide a billion dollars. The question was, were they going to provide it, and what did it look like?
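The dilution being described here gets spelled out with numbers later in the talk: roughly $650 million over 10 years, with maybe 40 of 50 states applying, and about 1,200 school districts in California. As a sanity check, the arithmetic can be written out in a few lines — the 80% participation rate and the $2 million state pass-through are the talk's illustrative assumptions, not official figures:

```python
# Back-of-the-envelope dilution of the ARRA education-technology money,
# using the figures quoted in the talk (participation and pass-through
# amounts are illustrative assumptions, not official data).
TOTAL = 650_000_000   # headline figure, dollars, over the whole program
YEARS = 10
STATES_APPLYING = 40  # assumed 80% of 50 states go for it
CA_DISTRICTS = 1200   # approximate count of California school districts
CA_PASS_THROUGH = 2_000_000  # what the state might make available per year

per_year = TOTAL / YEARS                      # nationwide, per year
per_state = per_year / STATES_APPLYING        # per participating state
per_district = CA_PASS_THROUGH / CA_DISTRICTS # per California district

print(f"${per_year:,.0f} per year nationwide")
print(f"${per_state:,.0f} per state per year")
print(f"${per_district:,.0f} per district per year")
```

A headline number in the hundreds of millions shrinks to a per-district figure in the low thousands before reimbursement strings are even applied.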
They basically said, we're going to provide a trillion dollars, and most of that money would be, what it is, reimbursements. And they'll say a billion over — exactly, because it's a billion over 10 years, and that's supposed to cover the entire country, all 50 states. But here's the problem: the money wasn't all in the form of grants. So you would forward-fund your project, and then you would apply for reimbursement on the back end. But in the meantime you're waiting for the money. Exactly. And how do you stay alive? Exactly. And the thing is, they don't always give you 100% reimbursement, since it's categorical funding. And then the legislators get in and they say, well, we didn't get enough people applying for this, so they narrow the focus for the coming years. Exactly, and they say, we're having a cut across the board, so we need to accept it. And another thing — remember the California Lottery was supposed to save the schools? Oh yeah. Because it was never really designed to supplement; it was designed to supplant funds. So the thing is, say for every hundred dollars of money that came via the state, in the form of ADA or line-item money for technology and education, what they would do is they would say, oh well, you get money from the lottery. So rather than supplementing — making it bigger — they would use it to supplant. So they took it out, and then they said, now you guys have a fair amount of money extra.

Are you ready? Yeah. This is for the Orange County Rescue Mission. Oh yeah, you know. I know. I know. Here we go. I'll keep on. What title? I am a teacher. I am a teacher.

Good morning, thanks for being patient with our late start. We still got some people coming in from registration, so it goes. Thank you all for coming. We have a couple of changes from the schedule that we had before. As you know, if you've been to UbuCons before, there's always one last-minute thing that shows up that we try to squeeze in where we can, and we had a good opportunity.
We're going to talk about that. So on the schedule tomorrow, Friday at 3, José Antonio Rey was able to come out here, and he's going to give a talk about identity management. So if you run an open source project, or you are wondering what it takes for someone who runs an open source project to handle user registrations, handle all that data in a safe and protected way, handle the logins of registrations and so on, as well as some of the new GDPR regulations — José is going to talk about what that takes, how to handle that, and how that applies to Ubuntu Server, for example. So it's an interesting little talk, and at 3 that will happen. And then directly following that, at 4, we'll have our UbuCon open sessions, so we hope to see a lot of people there.

And as things are sometimes a little fluid at SCALE, you can always check the online schedule; the updates for that will happen very shortly. The opening talk tomorrow is also not on the schedule yet. That's going to be the "Why Snappy" talk by Alan Pope, and he's going to talk about Snapcraft and some of the tools that developers can use when they want to develop and deliver apps, not only to Ubuntu users but to the other, like, 15 other distros that support snaps. Alan's a great guy and a wonderful presenter and a friend. So check the schedule online for updates. The displays outside have the updated schedule, and that's a bonus if your printed schedule doesn't have it. So just be aware there's a little bit changing — a little more detail rather than dramatic changes.

And I'm going to do a quick plug for that last session tomorrow. It'll be our last session for UbuCon, because the most important component of that talk is in the room right now, and that's all of you. It's a weird session — I haven't done anything like it before — but it's a topic that's near and dear to our hearts, and hopefully a lot of yours too, and that is the role of the Ubuntu desktop: what do you love about it, what would you like to see in
it, what can you think of that might help evangelize it? So I'm mentioning it now so that as UbuCon unfolds over these next day and a half, you can think about things you want to bring to that discussion. That discussion is going to be a brief intro from me, and then I'm going to just brainstorm for that time and talk about why we love the desktop and why isn't the entire world using it every day. That's at 4:30 tomorrow. Yeah, 4:30. 3:40? If there's a conflict between us, follow his guidance, and if there's a conflict between me and what the displays say, the displays are right, because that's the fact-checked thing I gave to SCALE. And in any conflict between print and the schedule, as with all of SCALE, always go with what's happening online. And there's an app — they've updated the app again this year for iOS and Android — and they also have the schedules here on the website. It's very good too, based in Drupal.

With all that out of the way, let me introduce our first speaker. Dr. Sam Coleman is an educator and technology coordinator, and he has brought a talk on Raspberry Pi in the educational context and the adventures thereof.

Hey, good morning, how you guys doing? All right. Welcome to UbuCon. I want to thank Richard for inviting me to come out and bump my gums a little bit. I've been coming to the SCALE conference — I think I got here at the fourth one, so this is my 13th or 14th year — and it's always cool to see old friends and to make new ones. So I'll get on with it. My talk is called "The Free Computer Lab." Now, you're probably wondering, who is this random dude, and why should I listen to him? This random dude — or this handsome devil right here — is me, and I'll give you a little bit about my background. I've been in the field of education for about 20 years, and I've worked in a variety of environments. I've been a teacher, coordinator, program manager; I've been a principal at a high school; and now I'm on the other side of the hill, back in the classroom, and I'm kind of liking it. And I work over at the Rialto Unified School
District. One of the things that I've been able to do, with the help of a lot of good people here at SCALE, is that I was encouraged to pursue my doctoral degree in educational technology, and I finally accomplished that goal with the encouragement of a lot of good people. And so for those people — and you know who you are — I want to say thank you, if I haven't had a chance to thank you formally. If you guys want to check out my dissertation, just google my name: Samuel Coleman dissertation. I actually open-sourced my doctoral thesis. That means I went all in: every tool, everything that is connected with my dissertation was done with open source technology — the analysis, the software, the word processing, everything. So if anybody wants to use any part of my dissertation, that's available to you. And I was humbled by the fact that thousands of people have actually downloaded it — and I'm thinking maybe they just couldn't sleep, so they needed something to help them go to sleep.

So that's a little bit about me. You're probably wondering, what did you do? What you see here is an example of a computer lab, and that's what I've been doing over the past 20 years or so: putting together computer labs in environments where there was no money, and maybe not much interest in providing those supports for children who come from low-income backgrounds. And so I've used a variety of tricks, and I have bumped heads with some of the best superintendents and IT directors in two states — here in California, and in Texas — to make that happen. I am nothing special. I think that you guys are just like I am; a lot of you have just tried to figure out, how do I get something to happen? You know, I wasn't raised with a silver spoon in my mouth. I was just an African American kid in southeast San Diego back in the 1970s who started my journey with technology because I wanted a TV. I wanted a 9-inch black-and-white TV in my room. So I talked to my mom about it, and she said, we can't afford for you to have a TV in your room.
So I was like, wait a second, how do I make this happen? So, like many of you, you start to tinker with old broken things, and you read books, and you talk to what I call the graybeards, and you go over to Radio Shack or Nielsen's Electronics and you ask questions, and you go to the library, and you figure it out. Because at the end of the day, I was told early on that there's a solution for everything; there's a way to do everything. It really comes down to: are you going to put in the elbow grease and the sweat equity to figure it out? And I was told early on that it isn't always about money.

So when I was faced with my first challenge, back in 2002, I was a new teacher at Castle Park Middle School, and I had a group of special education kids that I was providing support for. A little bit about these kids: these were students that had bad behaviors, and they were kind of known for tearing up the computer lab, so to speak — finding ways to tear keys off, or to actually break the mice, and do all kinds of heinous things to this technology. And so they were effectively banned from using it. And on the surface you might think, well, if they're tearing stuff up, they don't deserve the privilege. I saw it differently. I saw it as a means of providing some level of support to these kids and helping them get to the next level, by saying, hey, this technology is very cool: you can type your papers with it, you can go on the internet, you can find information, you can do different things. But I remember the first time I took my group of students over to the library, I was told unceremoniously that these little black and brown children — those kids — can't come into the lab. Now, I don't know about you guys, but that kind of pissed me off. Kind of pissed me off as a father, as a teacher, as somebody who's kind of dedicated to helping kids. It didn't sit well with me. So I decided to do what I always do: I figured, you know, I'm going to go ahead and try to put together some resources so that my kids had access to these tools. So
you're probably thinking, why go through all of the trouble, right? School districts are supposed to provide all this stuff to all kids, right? Well, I'm just asking the question: are public schools going broke? I mean, you guys pay taxes, right? Right. And I'm pretty sure that a lot of you guys are parents, and you have an expectation that when you send your kids to school, your tax dollars are being used to educate your children and provide tools that they can use so that they can, you know, pursue positive outcomes. The problem is that, in a lot of ways, public schools may not have the resources that you guys think.

So you're probably asking, where's the money? Where's the beef? Just a little bit of information: school districts are provided funding in a variety of different ways. We have what's called ADA — average daily attendance — here in the state of California. The state provides funding based on an attendance model. So when our little boys, when our little girls go to school, they attend, and they are real good about taking attendance at school. And if you don't send your kid, you get the phone call — the automated phone call — and you're wondering, how come they're so on it? Because that is directly one of the things that they use to fund their program. It pays for the building, it pays for teachers, it pays for whatever needs to be paid for. However, there's other funding streams that are used specifically to buy things like computers or other specific things.

So I'm going to talk a little bit about one area where a lot of funding was supposedly provided, and this was during the Obama administration, back in 2009. How many of you guys have heard of ARRA? Anybody ever heard of ARRA? It was part of the stimulus package that was provided back in 2009 under President Obama, and they had earmarked supposedly about $650 million over a period of 10 years to augment technology in public schools. Now, on the surface that sounds like a lot of money, right? But here's the problem:
that's $650 million over 10 years, which comes out to about $65 million a year — still a sizable amount, but this money was made available to any state that wanted to tap into those funds. So let's just say, just for easy math, maybe only 80% of the states decided to go for it. Let's just say 40 states — 80%. So you've got 40 states vying for a piece of that $65 million annually that they could use to tap in and augment their technology. So when you do the math, it comes out to maybe 1.5 million — just a little less than $2 million — per state per year. Now let's just say, in the state of California, the California Department of Education gets $2 million and says, we're going to go ahead and make this $2 million available to our school districts. Anybody want to guess how many school districts there are in the state of California? Anybody want to take a guess? This is the audience participation portion. Anybody want to take a guess? Who said 1,200? You're just about right. At last count, I counted about 1,200. So you've got $2 million, and you're going to split it amongst about 1,200 school districts. It's pretty ridiculous when you start to think about it. But not every district applies for the funds. So again, you see how a big chunk of money can get really, really small, really, really quick.

Here is where it gets bad — it gets even worse, OK? Grant money just means you apply for it, and as long as you meet the requirements, you get it. But these are what are called categorical funds, OK? Categorical funds basically means that they are designated for specific categories or specific uses. The problem is that there are strings attached, OK? And a lot of times, school districts have to do what's called forward funding: you go ahead and you spend your money first to buy equipment, software, training, and then you submit your receipts to the state for reimbursement. Exactly — "good luck" was exactly what I wanted, perfect response. And here's the kicker: beyond that, sometimes the state and the federal government doesn't
provide 100% reimbursement; they'll provide partial reimbursement. Another example of this would be special education services to our kids. The federal government has certain mandates, and they'll say, we have to provide these services to children with special needs. The federal government reimburses school districts only about 40 to 45% of their true cost. So what ends up happening is that every time a child with special needs comes to a school district — they need occupational therapy, they need speech, they need all these services — the school district has to eat about 60% of those costs. Which means that your little kid, instead of getting their full share of their average daily attendance money from the state, a portion of each one of those students' money is going to take care of these children with special needs. And I'm not suggesting that anybody is at fault; that's just how the system was set up.

So with that being said, I want you guys to keep in mind that politicians are quick to say, oh yeah, we gave away billions, or hundreds of millions, of dollars. It doesn't mean that the money actually got to the classroom. So I want you guys to take that into consideration, so the next time you hear a politician talking about how there's plenty of money for public schools, well, ask, you know, did it get to where it was supposed to go? Because a lot of times it doesn't. And with this information in mind, this is one of the reasons why I have kind of learned that I can't depend on the school district or the state or the federal government to provide those solutions. We need help now. And so this is one of the things that prompted me to start to work these projects.

So let's move on to my approach. Call it the decision loop, or the problem-solving loop. This is how I approach most problems in life — and, I mean, how many of you guys are coders or sysadmins or DevOps? You guys probably do the same thing in some way, shape, form, or fashion.
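The loop he goes on to describe — identify a problem, look for information, create some ideas, implement, iterate, rinse and repeat — could be sketched as a plain loop. The function bodies here are toy stand-ins (a "problem" shrinks by half each pass), not anything from the talk:

```python
# Toy sketch of the identify -> explore -> ideate -> implement -> iterate
# loop. The problem is modeled as a number; "solved" means it drops below 1.
def explore(problem):
    """Gather information about the problem (toy: just report its size)."""
    return problem

def create_ideas(problem, info):
    """Brainstorm a candidate fix (toy: propose halving the problem)."""
    return info / 2

def implement(idea):
    """Try the candidate (toy: the idea becomes the new state)."""
    return idea

def good_enough(result):
    """Stop once the remaining problem is small enough."""
    return result < 1

def solve(problem, max_rounds=10):
    result = problem
    for _ in range(max_rounds):
        info = explore(result)
        idea = create_ideas(result, info)
        result = implement(idea)
        if good_enough(result):
            break  # done; otherwise rinse and repeat
    return result

print(solve(20))  # 20 -> 10 -> 5 -> 2.5 -> 1.25 -> 0.625
```

The shape is what matters: each pass feeds what's left of the problem back into the next round, which is exactly the "iterate, rinse and repeat" step.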
This is how most people who have a logical mind approach, you know, problems: you identify a problem, you go ahead and you look for information, maybe you address it, maybe you create some ideas. From there you start to brainstorm and figure out what you need to do. You implement, you iterate — rinse and repeat, OK? So this is kind of what I did when I was confronted with the challenge of providing technology for my kids.

So again, I'm going to talk about my latest project, over at Rialto school district, at Kucera Middle School. We were promised brand-new computers in the spring of 2017, right? Those new systems were delivered about ten days ago. Let me repeat that for emphasis: we were promised new systems in the spring of 2017. We got those machines not even two weeks ago, OK? However, we were charged with providing certain supports to our students. We had a technology initiative that had certain benchmarks that we had to meet, and that was put in place in 2017. So with that in mind, the science department was asked to incorporate technology into its curriculum. They were to teach physics using technology and animation, and to make it engaging, and all these wonderful things — without being provided any equipment to make that happen, because we were promised that these things would be coming. So again, we weren't given any new hardware, and no software, to make that happen.

So with that in mind — the science teacher's name is James Gast, really good guy — he approaches me and says, Sam, I hear you know a little bit about tech. I say, yes sir. Do you think we can figure out a way to make this happen? So with that in mind, we found that the school district was in the process of getting rid of some old computers. Let me show you an example of what we have. These are Latitude 2120s. These are 32-bit computers. This little guy is about 14, 15 years old. It has a whopping 1 gig of RAM and a 32-bit processor. It has probably a little less power than this laptop, and this laptop has a little less power than
say, the cell phone in your pocket, OK? You guys got more computing power on your cell phones than this laptop, OK? Our challenge was to take three carts of these little laptops — which literally were going to be pushed out of the school and into a landfill — to repair these machines, and to deal with the Windows operating system that was in place. We were told we really couldn't use that, and I had no desire to, obviously. But when we approached the IT department and said, hey, you guys got any funds, maybe for just getting some parts so that we can piece together these old laptops as a bridge solution until the new equipment comes? You guys can't get on the network. You can't ask for help. We don't have any money to help you. You are on your own. I said, fine, no problem at all, I got this.

So with that in mind, we decided we need to come up with a game plan. This was the targeted outcome: we wanted enough working computers for a classroom — we needed a minimum of 35 working systems, OK? We needed to run an office suite, which would be something like LibreOffice. We needed to be able to run a media player — VLC was our choice. And then we needed some type of CAD or animation application. This is where we kind of went round and round and round, because we needed something that kids would like, and we needed something that was pretty easy for the science teacher to learn, because he was going to be teaching the class. We settled on Blender. Anybody ever heard of Blender, the animation program? Can you imagine the headache and the hassle that I had getting Blender to work on a 32-bit box? I used to have a full head of hair. I used to be beautiful. I used to look just like Richard. See what happens?

Anyway, so we kind of had our marching orders. We knew what we were targeting and what the outcome was that we wanted to pursue. So we came up with a simple blueprint. We said, hey, all we got to do is: we'll appropriate the computers slated for disposal. We were going to sacrifice a few machines, because we needed parts — obviously, because if
you know anything about middle school kids, they tear things up for no reason. And once we sacrificed a certain number of machines — we decided, OK, of the 72, 58 — that left us about 14, maybe 15 machines for spares. Keyboards, all the little bits and pieces, screws, cracked screens — all that came from those donor machines. And then we had to identify an operating system that would support those particular laptops. Then we had to get the apps installed — the OS and the apps installed, I should say. And then we had to fine-tune the systems. So you figure, hey, what could go wrong? Easy peasy, right? No problem, right? Yeah.

So at this point we had to explore information and we had to create some ideas. Because really, the biggest challenge that we had — it really wasn't repairing the machines. That was pretty easy. You know, pulling apart one of these little guys probably took me all of maybe 10 or 15 minutes. Didn't have a manual; I just kind of figured it out. Obviously most of you guys could probably do the same thing. These are relatively simple machines, so this wasn't the challenge so much. We already knew what the limitations of the hardware were, so that was pretty straightforward. Our problem was choosing an operating system.

So initially, I knew that I would be looking for a 32-bit operating system, and I just kind of brainstormed, and this is the initial list that I came up with. I looked at FreeBSD. Anybody know Kris Moore over at TrueNAS? He's a great guy, and he's always saying, hey, you know, show a little love to the BSDs, that type of thing. I'm pretty sure Kris or somebody from the BSD community will be here. Good people, love them to death. FreeBSD is something I wanted to like. I want to like them, you know? It's kind of like — anybody ever go to a family reunion, and there's this uncle that's just a little different? Smells like cheese? You want to like the guy, he's funny, but you just don't feel that connection to him. That's kind of what the BSDs are to
me. I have toyed with them, and I like them a lot, but I just couldn't get into it. And there is a limit in terms of the types of applications that make sense to run on a 32-bit platform. Same thing with NetBSD. NetBSD will run on a toaster, but it just seemed like a lot of work to make it work.

And then I got to my happy place. I am a self-professed Deb-head. I love Debian. Love it, always have, always will. And so I thought Debian would work, you know, initially. The only problem that I had with Debian initially — it wasn't the installs; it was the fact that the repositories, the packages, were a little long in the tooth. We wanted the most current apps that we could get, or the most current iteration of the applications that we wanted to run. But I always kind of kept in the back of my mind: yeah, Debian will work, but let's see what else we can do.

And so I looked at what I call Debian's little brother: Raspbian. How many of you guys know about Raspbian? It targets ARM-based processors — and obviously this is not an ARM-based processor, it's x86. But at the time, Raspbian also came out with an x86 port. Most people didn't know. So I was like, cool, I'll install Raspbian and see what happens. Because at that time, a lot of kids at the school where I worked — I had introduced the Raspberry Pi to a lot of those kids. In fact, I have pretty deep ties with the Raspberry Pi community in Riverside, and I've participated in multiple Pi Jams and, you know, done a lot of cool things. So the kids were kind of accustomed to that desktop interface. And I said, you know what, I can make this box look like a Raspberry Pi — why not? So initially, that's the route we went. We went ahead and we installed Raspbian, and it was a pretty, you know, uneventful install. But it's still Debian, right?
The interface was pretty. I could run Synaptic on it for system updates, and train a kid to do that. I could get packages that were relatively current. The version of Blender that we had at the time was version 2.68, which was functional — it worked. So, the 58 machines: we put Raspbian on them, and, you know, we felt pretty good about our choice, right? We had a functional lab — 58 computers, right, in those little carts. We also made one cart available as a rover to other classes, so that those kids could do word processing. Because again, our systems were banned from the network, OK? When I had to do updates, that type of thing, I just took out my portable connection to the internet, right? Hotspot it, and I would, you know, apt-get update, whatever I needed to do, or download applications that I needed to. I didn't have the capacity to use something like Ansible, because it was just too much hassle. I really had to tweak the individual devices to get the most out of each machine, OK? So Raspbian initially was the choice, and it worked OK for a while.

And initially, these were the benefits to Raspbian: easy install; friendly GUI; based on Debian — yeah; easy package management, because again, I'm a Synaptic guy, but you could use aptitude, or you could go to the command line if you wanted to; and it was an x86 port. It ticked all the boxes, right?
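That "ticked all the boxes" evaluation is, at bottom, a set-difference check: the requirements from the game plan versus what each candidate offers. A toy sketch — the feature sets below are illustrative stand-ins, not a real audit of either system:

```python
# Toy version of the "does this distro tick all the boxes?" check.
# Requirements come from the talk's game plan; candidate feature sets
# are illustrative assumptions for the sake of the example.
REQUIREMENTS = {"32-bit x86", "office suite", "VLC",
                "Blender >= 2.68", "easy package management"}

candidates = {
    "Raspbian x86": {"32-bit x86", "office suite", "VLC",
                     "Blender >= 2.68", "easy package management"},
    "FreeBSD":      {"32-bit x86", "office suite", "VLC"},
}

for name, features in candidates.items():
    missing = REQUIREMENTS - features  # boxes left unticked
    if missing:
        print(f"{name}: missing {sorted(missing)}")
    else:
        print(f"{name}: ticks all the boxes")
```

Writing the requirements down as an explicit set is what keeps the later distro-shopping (on DistroWatch and elsewhere) honest: a candidate either covers the set or it doesn't.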
But here's the thing: you're always on the lookout for something a little bit better, something that would give you some more options. And the reason was that the downside to Raspbian was the fact that the packages, again, were not the most current, and it's tied to 32-bit Debian. Now the downside to that is, to my knowledge (correct me if I'm wrong), after version 8 Debian is officially not supporting 32-bit. So you get to version 8 of Debian, and that's pretty much it. So my guess would be that unless the Raspbian team decides to take over and continue that, I could expect that I'm not going to get the most up-to-date operating system out of Raspbian, given the fact that it's tied to Debian, okay? So at this time I started to think in terms of how long I need these systems to run, because they're already old. I figured the best case scenario is I could probably get another two or three years of use out of them. Since I have a limited use case for these particular machines, we could look forward to getting newer equipment, but I had to keep these boxes going for at least a year or two, maybe three years. So I wasn't too concerned about the operating system getting too, you know, long in the tooth for the apps to work, okay? And so in the back of my mind I kept thinking, I need to make sure that at least I can run these apps: an office suite; Blender, at least version 2.68; and VLC, because we actually had some tutorials in video form that the students were using in order to move through the curriculum at their own pace. So given the fact that I was worried about the software being a little long in the tooth, I went to where most of you guys go: I went to DistroWatch. How many of you guys use DistroWatch? It's my guilty pleasure. Sometimes I'm at work and I'm sitting there and I'm like, I wonder what's going on with DistroWatch. So I go over there real quick and I'm like, ooh, something new, right? And I mean, I was a DistroWatch junkie, I'm not going to kid you. I have one box and all I use it for is trying distros. If you
use an operating system that's based on Red Hat or Debian, you know the flavors. Sometimes it's cool: I like this desktop environment, I like the way that they tweak this and they tweak that. And at the end of the day, I kind of knew that whatever operating system I would try to upgrade these systems to, it was probably going to be Debian-based; in a bunch of ways I already kind of knew. So, having visited DistroWatch, here are some of the different options. Okay, oops, let me go back. Here we go. So I looked at these options. I looked at Lubuntu; we have a Lubuntu person up front, and you guys are going to have an opportunity to hear more about the wonderful things that are going on with that particular project. There's Ubuntu MATE, which is a great desktop environment, or "mate," some people would say. I think it's "mah-tay," right, for those who drink their tea with their pinkies up. And then there is Xubuntu, which is a good operating system as well. So when I looked at these particular options, and I played with all of them, I decided that I would pick Lubuntu. And the reason that I picked Lubuntu was pretty simple: it was a 32-bit build, right, so I knew it would run on my machines. The version at the time that I installed was, I want to say, 16.10. 16.10, I believe; currently I think 17.10 is available now. But I know that Lubuntu kind of follows, for the most part mirrors, the mainline Ubuntu OS, even though that's 64-bit. But you got the support and the updates, and that was important to me, because even though these machines weren't going to be expected to last more than two years or so, I still wanted the comfort of knowing that the packages I was pulling down from those repositories were going to be maintained and upgraded. So it was kind of a comfort to me. So I knew the repositories were going to be maintained; I had a 32-bit build; and the driver support was essential, because with older machines you have to make sure that things work, like the video works, you have to make sure that
sound works when a kid plugs into that particular device. Especially a 12- or 13-year-old: they have the patience of a gnat, and the first thing is, "Doctor, come here, it doesn't work, it sucks," and I'm like, come on, just calm down, just calm down. So again, that was kind of important for my own sanity. So again: the driver support, the OS updates are important, it tracks mainline Ubuntu, and it has a modern and polished look to the desktop. And I'm going to tell you something. Most people have argued with me and said, hey, you could have gone with Xubuntu. Xubuntu is okay, but you know, the kids just couldn't get into that little mouse on the desktop. They just couldn't get into it. And I know it sounds silly, but looks are kind of important. People want to feel like they have a modern, sleek, updated desktop experience. So it was important for the kids, and it was important for the instructor who was running the course. So I said okay, and we went with Lubuntu. So that pretty much finishes my presentation as far as how we made this thing, you know, come about. For many of you guys, if you have the mind to do something along these lines, maybe your kid attended a school where the technology is a little suspect, I would encourage you, whether you're going to do this for a school or a non-profit organization, take it on. Provide these opportunities for our kids. It's important. It's important simply because many kids who have great minds will never get the opportunity to access technology, through no fault of their own. Sometimes it's a situation where maybe a school district doesn't have a strategic plan or an IT plan. It could be because maybe they had the best of intentions and funding fell through for whatever reason. If you can reach out and help a school or a non-profit organization, do it. Maybe you've got a dozen computers. For those who are business owners, maybe you're getting ready to get rid of those old i5 machines and you want to donate them and do something purposeful. Find a club, find a computer
club, find a school, find a non-profit organization. Partner with them and say, I'd like to donate these particular machines; tell me about your needs. Formulate a plan and get those machines into the hands of kids who can do something purposeful with them. Again, I don't want to prattle on; you guys probably have much better things to do. But I always leave a little ask-me-anything time. If you guys have questions or concerns or anything, feel free to ask at this time. If you can't think of anything you'd like to know right now, that's my email address, Dr. Samuel Coleman at gmail.com, and that's my cell number; feel free to call me if you have questions or concerns. People have asked me to participate in a variety of projects, and a lot of times I'm more than happy to do that. So that's the presentation. I think we've got a couple minutes left; if anybody has any questions or concerns, raise your hand at this time. Yes sir? Thank you. I used to do the same type of thing for the rescue mission, helping the homeless, only I had a mobile lab. So how do you find a place? Mobile is okay, but it doesn't really work all that well, as you well know, setting stuff up. How do you find a location, where a building is, schools lease out, what would you recommend? That's a good question, and it kind of goes to your situation, but I'll give you some ideas about things I've done in the past. Now, early on, I used my garage, and there was a time when I had maybe like 60 PCs that I was repurposing. I had a group of students, and I would pick them up, and on the weekends we would be in there, and I'd buy pizza and Coca-Cola, and you've got a bunch of teenagers wrenching on these little boxes and whatnot. And in an effort to stay married, I decided, well, I was told by the boss, 24 years in, I was told by the boss. She said, hey, love you. Love you too, honey. You want to stay married?
Yes, dear. She said, you need to find a place to take this crap so I can park my car, right? So I would say this: if you have a very understanding spouse, then maybe you can make it work by storing the equipment there. The other option would be this. If you partner with a non-profit organization, sometimes they'll actually have storage space where they can allow you to store equipment. Really, you need two things. You need a place to work on it, which means you're going to need at least a small room with access to power and, ideally, the web. Sometimes you have to do things piecemeal, and it depends on the scale, the number of systems that you're going to be repurposing. When I started, I was doing like five to ten boxes at a time, so I could bring some things home, do a little work on them, stick them back in my van, take them and drop them off where they needed to go. At one time I had a project called the PC Reclamation Project, and I did that over the course of a couple of summers. We repurposed somewhere between 300 and just under 400 machines. And so, yeah, I had the help of seven student volunteers, and I was proud to say that these kids really worked hard. But it became a logistical nightmare, because for people who wanted to donate equipment, we literally had to tell them, okay, we can pick up your equipment in a week, or two weeks out, or three weeks out. So there's a couple of ways you can do it. If you are partnering with an organization that has space, just ask if they can provide you a closet to store the equipment. If you're going to be doing the work on premises, then you're going to need something like locking file cabinets, because if you, let's say, tear apart a machine, you've got power supplies, you've got cases, and you'll need a place to secure that. If you're doing it out of your garage and your spouse is understanding, that could work; I don't advise it. The other option would be simply to enlist those individuals who are partnering with you and providing the labor, so to speak. Ask them if they
would be willing to store five or six machines, so you don't have to keep everything centralized. The worst case scenario, if you don't mind ponying up for storage, and I've done that too, is that it can get expensive. So it really just depends on the scale of the project, how many machines you want to repurpose, and the flexibility of those people that you're working with. So, question? Yes sir. I have two questions, if you don't mind; I'll just ask them and you can answer them at your leisure. First of all, what's next in the life of these 32-bit Latitudes that you have? The second question is, in addition to learning about these applications that are free and open source software, obviously, what else did the students learn, if anything, in regards to open source and Linux and the like? Okay, I'm going to answer the second question first, as far as what the students learned. The things that we were really focusing in on with the technology: we had a technology plan, a district technology plan, and basically each site had to kind of decide what they wanted to pursue. The science department wanted to integrate technology to teach certain things within the curriculum, so we just happened to pick Blender because it had animation and elements of physics and that type of thing. So the kids were exposed to that particular application, but in reality the platform became less important, because the kids were running the app. And I think for a lot of people, operating systems are like plumbing: nobody cares to see it, nobody cares as long as it functions; it should just kind of disappear into the background. So one of the things that I've done as I've worked with students at my school: we'll do presentations, we'll talk about open source software, we'll talk about the history of it, we'll bring in things like Raspberry Pi computers, maybe x86 systems, and say, hey, these are the wonderful things that are available to you. And so the kids that I've worked with, about 150 of them, got a little taste of what
open source was and how it impacts them. And so it was just something to expose them; that was really the idea. Now, with regards to your first question, the 32-bit machines, so to speak: I am of the opinion that those machines should be treated like old Volkswagens. You keep patching them together until it's just not viable to do it anymore. It doesn't cost the school district anything to keep those things going. I think, really, in a perfect world, there should be a blend between systems that are controlled at the local level and what I call the consumptive devices, like Chromebooks or iPads or whatever, that provide a certain function. The problem that I'm seeing in technology in education is that the IT departments are making decisions based on what's easy for them to maintain and monitor. And the thing is, for many of you, or at least maybe some of the more seasoned individuals here, when you think about your experiences, when you first discovered computing, maybe you had an old Heathkit that you put together, or maybe you had an old Commodore 64 with the cassette drive, right? Anybody still got theirs?
I still got mine; I've got a 1541 drive. But here's the thing: back then we weren't just consumers of content. We were interested in reading Byte Magazine and typing in those programs so we could learn how to code. We were producers. I think that IT departments in schools, unknowingly, are creating consumers of technology, but these kids aren't creating anything. They're not learning coding; maybe they're learning Scratch, which I have problems with, but that's another question, another concern. So I think that many of our kids are learning to consume content. I think with those old 32-bit machines, some of these kids got a chance to create content, because we put Python IDEs on those systems, so some kids got exposed to that. Those opportunities aren't as easy to make available on a Chromebook; not to say it's impossible, but the IT departments are thinking, we don't want these kids to have too much access to these machines, because it's going to create a headache for us. So the thing is, the interests of the IT department and what we're trying to accomplish as teachers and as parents are diametrically opposed. IT departments want control; teachers want the ability to enhance learning, to create opportunity, and to spark a kid's intellectual curiosity. And so this is one of the reasons why, me personally, I want those 32-bit machines to continue to be available. Because I can say, hey, Johnny, experiment with this, and if he tears it up, so what, right?
Like I said, the worst case scenario is that this little kid, little Johnny or little Martha, might learn something. Hope I answered your question. Yes sir. Actually, I've got a similar story. Up in high school, I was in the special-ed department. The IT department considered us a fifth-world country; they wanted nothing to do with us. Brother, you need to come up here. They would give us no computer equipment, they would give us no technical support whatsoever. We were running Apple; we had a lot of old Apple machines when everyone else had Windows XP. So one of my jobs was, well, we got donations from large corporations in the area, Harbor Freight and others, but these machines were old 486s or barely Pentiums, and my job was to install any operating system with a word processor and have the ability to transfer documents from that machine to a printer, or to another computer where it could be printed. And that was my job from 2002 to 2004. So it's a similar thing. I ended up having to jump in and figure out what operating system would be the best, and I ended up choosing Damn Small Linux at that time. Oh, I've got you a better DSL: Tiny Core. Tiny Core didn't exist then; this was before the Damn Small Linux and Tiny Core split. Actually, I played with Damn Small Linux; it was a really good OS. I enjoyed it; it was really Debian-ish. Yeah. I just wanted to bring up this story because it's something that does happen. The LinuxChix LA chapter does this: they take old computers and they set them up in schools in third-world countries as well. So thank you. Any other questions? Yes. That's a good question. Now here's the thing: there were OSes that were older that ran better, but here's the dance. The older the operating system, the less likely you were going to get support, updates, patches, that type of thing. And the other piece of it is that the repositories that those operating systems could access are going to have old software that can't really, you know, address our specific needs. So you kind of had to
strike a balance. I wanted the most current operating system that I could get, with the most current applications that I could run, you know, but also keeping in mind that I had limitations in this hardware. Because just because you can meet the hardware specs doesn't mean that's going to be the optimal experience. So it always required me to go back to DistroWatch and say, let me try this one, let me try that one. I'm always in a constant testing state to see how I can eke every little cycle out of this little processor, because again, I only had about as much processor power as the cell phone in your pocket. So, yeah. Oh, is that me? I don't need this anymore, I don't think. Any other questions, guys? Yes sir. Yeah, so the question was, what do you do when you have old hardware and, you know, you're at a point where that hardware is just so antiquated? This is the reason why I love DistroWatch. DistroWatch is a huge resource for me, because you can actually do a specific search for operating systems that are optimized for old hardware. I'll give you an example. Let's say that someone says, hey, Coleman, I have ten 486 DX-100s, and this actually happened. Yeah. I have a couple of options to make those systems work, depending on my use case, okay? My use case dictates how I'm going to use those machines and whether they're viable or not. So if all I really want to do is word processing on those, I could take the hard drives out, configure a USB thumb drive, and I could use something portable, perhaps, on that box if I really wanted to, right? If I'm really wanting to be a masochist, I could probably do something like, I don't know, what's that, Patrick, what's that old operating system, the one from Patrick? Ooh, don't even say it. Yeah, I could run Slackware, or Slax or something like that, on an old box like that if I really wanted to. So there's options. Everything is use case. Think of an old computer the same way you think of an old car, okay? If you
got an old Volkswagen Bug, a 1966, sure, you probably don't want to drive it from here to New York. But could you get back and forth to work in it? Probably. Is it going to be a little bit of a pain to adjust those valves? Probably. But if someone gave you the keys to it, you probably wouldn't take it to the crusher, because again, with some elbow grease and, you know, the idiot's guide to repairing VWs, you could probably figure out a way to make it useful. And so again, I would caution anybody: before you throw out a perfectly working computer, think of a use case that makes sense, go to DistroWatch, do your research, and do a little testing to see if it makes sense. Because like I said, maybe that poor kid next door would see that as the world's greatest tool, and it might put that little girl or little boy on the path to becoming the world's greatest coder. You never know. Any other questions, guys? What do you see as the biggest need? We just need to expose kids to technology, and I mean not just to consumptive devices. We need to expose kids to technology, and what I mean by that is to say, hey, look at some of the cool things that we can make. That's the problem. When I was a kid, you know, I was curious about things. I took things apart. I made stuff. I think one of the worst things was that a lot of public schools took wood shop and metal shop and auto shop out of the schools. Kids don't have the opportunity to get their hands dirty like they once did. And the nice part is that with technology, you don't have to have grease under your fingernails; it's a pretty clean thing. So I think it's important that we expose kids to technology in ways that show them that they can make things, that they can tear things apart. Give them an opportunity to really interact with the technology rather than just consume content on a platform. If you guys have any other questions, I'll be around; feel free to, you know, just pull me to the side. Anything I can do to help you, let me know. Thank you, Dr.
Coleman. It's nice that we had Dr. Coleman with us today; we like to have at least one story for education. Thank you very much for that. We have a 30-minute break, after which we come back here, and Der Hans will be talking about software management for Debian and Ubuntu. So we'll see you at 11:30. Okay, so when you do apt-cache search, when you search for a package, you get the one-liner, right? And then when you say apt-cache show, show me the information, you get the full version of it, so you get all the information. Both do a good job of providing them in multiple languages, so, depending on what your setup is, you'll see localized descriptions; my package descriptions are usually in German, so keep that in mind as well. I like that we have a description MD5. Now, MD5: we only have one, and it's only an MD5. We don't have a SHA-1 or SHA-256. Do we need to worry about security on the description? Nah, it's a bunch of text, right? But the cool thing on this, for me, and I think this is there for translators, is that they don't have to go through and look and go, did it change? They just need to check the MD5. Is there a new MD5 for that description? If there is, let's go do a diff and see if I need to go through and update the description in my language, right? So you've got this way of double-checking inside. It's not really that useful for us, I don't use it myself, but I think it's awesome. Task, and we'll talk about tasks a little bit, build-essential, and then supported. I like the fact that the package shows you what the support is for that particular package. So let's talk about some different tools. I'm not gonna go through all these; any one of these could possibly take an entire day, an entire session here.
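The two commands being contrasted, the one-line search result and the full package record, would be typed like this. A dry-run sketch (commands echoed rather than executed, since apt may not be available here); "vlc" is just an example package name:

```shell
# Dry-run sketch: echo each command instead of executing it.
run() { echo "+ $*"; }

run apt-cache search vlc   # one line per matching package: name and short description
run apt-cache show vlc     # full record: version, depends, description MD5, task, support
```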
But if you're using bash as your shell, you can type apt, hit Tab twice, and it will show you a list of possible commands that complete that. So apt-add-repository, apturl-gtk, those are all commands that start with apt. So as you're getting to learn more about Ubuntu and Debian, do this once in a while, pick a new tool that you're not familiar with that looks like it might be of interest, and go look at the man page. Go search for it and see if you can find some online documentation on how to use it. I will bring up a couple as we go through an example. Also, once you're kind of familiar with the apt stuff, do a dpkg Tab-Tab as well: dpkg, Debian package, and there's a whole lot of, again, useful tools under the hood. And there's a ton that don't fit those two patterns, but this will give you plenty of entertainment for a while, you know, as you're learning things. One of the things I really, really love about dpkg is that I've been using Debian and Ubuntu for 20 years, since before Ubuntu existed, and I have not needed to learn that much about dpkg. It can be very arcane. It's an awesome tool, does some great stuff, but apt has done a fantastic job and hidden a lot of that complexity. If I was maintaining packages, I would probably have to get very familiar with a lot of these things. But just existing as an end user, as an occasional developer, a lot of that is hidden from you; it's awesome. There are a couple dpkg things that I'll bring up that are kind of useful. So, doing upgrades: fairly easy. You do a sudo; I'll cover sudo later on, but sudo lets you run a command as another user, typically root. As your normal user, you don't have the authority to go through and add new packages to your system. So you can do a sudo apt update, which says, go get a new version of what the available packages are. So this is getting all the package lists.
I mentioned the security repo and the updates repo and the main repo; it goes and grabs those lists so that you can find out what packages are available in the repositories today. If you stay with LTS, they don't change all that much; you're fine. If you're using Sid, what was available yesterday might not be available today, because it really broke things and people yelled and it got removed, right? So do an update before you do anything else, in most cases. And then you've got sudo apt full-upgrade. Now, this is replacing what we used to call dist-upgrade; both still work. I don't know if dist-upgrade is gonna go away, because a lot of us have got finger macros; my fingers just type that and I don't even think about it. But full-upgrade: they wanted to change the name because it made it a little bit more obvious what it is. And I'll get to that when I talk about upgrade, the next thing. So we have upgrade and we have full-upgrade. What's the difference? Upgrade will only grab new versions of packages. So let's say, again, going back to foo: I have package foo installed. There's a new version of foo that has been split out so that the libraries are in a separate package. So I now have foo and foo-lib, and I need to install both of those in order to have package foo. If I do an upgrade, it won't update that, because it won't install the new package foo-lib; it would only grab a standalone foo. So you're only updating things that don't have dependencies on something you don't already have installed. It is useful on occasion, and I'm glad that they still have it, but for the most part what we'll use is the full-upgrade, and that says, grab all the new stuff related to things I already have installed. If you've never installed foo, it's not gonna just go, hey, I think you should try package foo, let's go through and add that, right?
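The routine distinction above comes down to three commands. Sketched as a dry run (echoed, not executed, since they need root on a real Debian or Ubuntu box):

```shell
# Dry-run sketch: echo each command instead of executing it.
run() { echo "+ $*"; }

run sudo apt update        # refresh the package lists first, in most cases
run sudo apt upgrade       # new versions only; won't pull in a newly split foo-lib
run sudo apt full-upgrade  # also installs/removes packages as dependencies change
                           # (same behavior as the older 'apt-get dist-upgrade')
```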
I'm certain we could create a command-line tool that will add that type of stuff, but it's not built in, right? So what it will do is say, okay, you've got foo; now foo is dependent on this foo-lib that's been split out; I'll grab both of those in order to do the update. Same thing for patching any other package you've got installed. So it'll get all those dependencies from all the different repos that you've got configured. Now, one caveat. There is one thing that Red Hat does better, and this is one of the few places I will say that, because I'm definitely a Debian guy. Red Hat makes it easier to enable and disable different repositories; it's very simple to do that on the command line. In Debian land, that is not so simple. So you need this really long string. Please don't try to write it down; go grab it from the slides or from one of my blog posts. And what this does is say, put your security updates in one file. So I move security out of the main file, put it in a new file, and then comment it out so you don't have it in both places. And then on the command line, disable all the other repositories, disable everything, disable this and disable that, including things I didn't even know you needed to disable, and then go ahead and add the security thing back in. But what this allows me to do is an apt full-upgrade that only gets security updates. So if some package has been changed and I'm getting ready to do a presentation, you guys saw, I want the thing that worked, and it's bad when it doesn't. So I try not to change things that I don't absolutely need to change. Same thing for my production environment. They go through and change the man page? I don't care. I don't care if I have the old man page in my production environment; in fact, I probably don't have man pages in my production environment. So this allows me to just see the things that have security release issues. And it also allows me to see a smaller list.
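The exact "really long string" on the slide isn't captured in the transcript, so here is one commonly used pattern for the same idea, sketched as a dry run. It assumes you have copied the security repo line into its own file, /etc/apt/security.sources.list (a name chosen here for illustration), and commented it out of the main sources.list, exactly as described above:

```shell
# Dry-run sketch: echo the command instead of executing it.
run() { echo "+ $*"; }

# Point apt at only the security list, ignoring sources.list and sources.list.d,
# so full-upgrade sees nothing but security updates:
run sudo apt-get full-upgrade \
    -o Dir::Etc::SourceList=/etc/apt/security.sources.list \
    -o Dir::Etc::SourceParts=/dev/null
```

The Dir::Etc::SourceList and Dir::Etc::SourceParts options are standard apt configuration keys; the filename is the assumption.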
So for years and years, I was getting security updates for X drivers for hardware I didn't have. They were updating the whatever-chip driver, and my laptop doesn't have that chip; I don't need to worry about that security problem, so I didn't need to go through and make those changes. I like to be able to control that, but I have to admit, Debian makes it a little bit difficult. All right, release upgrades. So let's say you're running 16.04 and you want to move to 18.04. Actually, let me change that, because I forgot to change the order on the slide. So let's say you're running stable, and testing is frozen right now, and testing releases, let's say, Monday, and you want to update, right? Don't update while you're here; you should be enjoying SCALE. Well, let's say you want to update Monday. You got home, you're like, oh, this is so cool, let's move to that. You go through and do a full-upgrade, and does it upgrade you to the new version? No, Debian will stay with the version you've got in there. So you need to go through and change your sources.list files and say, okay, now I want to move to, I forget which codename we're at, the next version. And then do your apt update and your apt full-upgrade, and that will get you the new version. For Ubuntu, they have a cool tool called do-release-upgrade. It's in the ubuntu-release-upgrader-core package, not installed by default, and it's not an obvious package name for the tool that you want, so I put both of them in here. do-release-upgrade will then go through and take care of all that stuff for you and do the upgrade. It also asks you multiple times: are you sure? This will take multiple hours, it will use up a bunch of your bandwidth and a bunch of your space, do you really want to do this? And I believe by default, I forgot to double-check,
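Both release-upgrade paths just described can be sketched side by side (dry run, echoed rather than executed; the Debian path additionally needs a hand edit of /etc/apt/sources.list to the next codename first):

```shell
# Dry-run sketch: echo each command instead of executing it.
run() { echo "+ $*"; }

# Debian: after editing /etc/apt/sources.list to the next release's codename:
run sudo apt update
run sudo apt full-upgrade

# Ubuntu: the helper lives in the ubuntu-release-upgrader-core package:
run sudo apt install ubuntu-release-upgrader-core
run sudo do-release-upgrade
```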
When the first version of a new LTS comes out, they don't default to offering the new version; they wait until the first point release. You can override that. And from an LTS, it also won't, by default, show you the short-term releases. So when the next short-term release comes out, it won't automatically say, hey, let's upgrade to the short-term release. You actually have to override a couple of things to do that. Again, look at the documentation. Simple package install. I've been talking about packages, but how do we get them? So, apt install and then the package name. Again, we need the sudo to get root privileges. And if you didn't want all those recommendations: one of my favorite tools is AsciiDoc. I just wrote this presentation in it and converted it to other things. I mentioned it to a friend of mine and he's like, oh, I'm gonna try this out. He went through and did apt install asciidoc, and it proceeded to want to install 14 trillabytes of things. It's got all kinds of dependencies and recommendations for different stuff. Well, if you're not wanting to convert from AsciiDoc to some VAX man page, you don't need the VAX man page script. So if you just do a no-install-recommends, suddenly it just installs AsciiDoc and a few other tools instead of going through and doubling the size of your install. It's not that bad, but it's a lot for a small tool. So that's one way of changing what you're doing; you can also change your configuration overall. I mentioned searching. So apt-cache search webserver, apt-cache search browser, things like that. So you can use keywords to do that, and as you get more familiar with the packaging system, you'll find better keywords that work. Package info: apt-cache show, that's how you get the package info, the long output, the piece that I showed earlier. And then, removing packages.
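The AsciiDoc anecdote in two commands, sketched as a dry run (echoed, not executed):

```shell
# Dry-run sketch: echo each command instead of executing it.
run() { echo "+ $*"; }

run sudo apt install asciidoc                          # pulls in every recommendation
run sudo apt install --no-install-recommends asciidoc  # just the tool and hard depends
```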
So you installed the package and you decide you don't want it anymore, just because you don't use it and you want to free up space, or it pissed you off and you want it to go away, or you were experimenting, you're done with the experiment, it's time to go. So apt remove will go through and remove the package, but it doesn't remove your local configuration or your local data. So let's say you do apt remove mysql-server. It will go through and remove the software for MySQL, but it won't remove your configuration file and it won't remove your database files. One of the reasons you might need to apt remove occasionally is so you can install a new version, because there are some conflicts and stuff like that, right? So you can remove it and still update, or, for instance, perhaps you're moving from the MySQL package to the MariaDB package or the Percona package. So you can safely do that and keep your old data. But if you're like, I'm trying to save space and it's using up way too much of it, then you can do an apt purge, and that will go through and remove most of that data. If you're doing that, you might also want to make sure you know where that data is stored, and you can do an audit to make sure you've gotten rid of all the data you want to get rid of. You also will sometimes see this prompt that says, hey, you've got something that was installed automatically that you don't need anymore. So let's say, for instance, I had done that foo update that now requires foo-lib, and I'm done with foo, and I apt remove foo, apt purge foo, right, and get rid of it. Well, that doesn't automatically get rid of foo-lib. foo-lib will still be installed, but I didn't tell the system to install foo-lib. The system just noticed I needed foo-lib and nicely went through and installed it for me, right?
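The remove-versus-purge distinction, sketched as a dry run (echoed, not executed):

```shell
# Dry-run sketch: echo each command instead of executing it.
run() { echo "+ $*"; }

run sudo apt remove mysql-server  # removes the software; keeps config and databases
run sudo apt purge mysql-server   # also removes the packaged configuration
# Data the package manager doesn't own (e.g. databases under /var/lib/mysql)
# may still need a manual audit afterwards.
```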
So `apt autoremove` is a way of saying, oh, hey, here's the stuff that you've still got installed, but the things that required it aren't installed anymore, you got rid of those, so you don't need these anymore either. And that's what happened here with libhunspell, whatever that is. The system noticed that whatever tool had required this particular version was gone, but it still had this libhunspell version. I don't need this anymore; I can run `apt autoremove` to clean that up. Now, can I just run `apt autoremove`? No, I need root privileges, so it's `sudo apt autoremove`. All right, `apt-cache search`, and installing a specific version. Inside the package information, one of the things I pointed out is that you get the version. Sometimes you need a specific version for some reason, and this is the way to do it: find out which package has that version, and then you do `apt install`, the package name, an equals sign, and then the version number you're wanting to get. So if I wanted to go back to libhunspell 1.4.0, I could take that version and add it in there, all right? So here's an example of that. I had a couple of different versions of Firefox available. I did an `apt install` and said, okay, go grab this particular version. The package system gave me a notice: hey, by the way, you're installing an older version; your package will now be downgraded. That might be exactly why you're doing this, right? The new version broke something and you need to go back. Firefox ESR was actually an example of that for me, when they moved to the new WebExtensions add-ons and I wasn't ready for that. I just figured out how to get done what I wanted to get done and moved up. But I could have said, okay, install the old version of Firefox instead so I can continue using my add-ons, and I'll get to the new stuff later on when I have time. All right, it is also possible to both install and uninstall at the same time.
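In command form (the Firefox version string below is illustrative; pick one that `apt-cache policy` actually lists on your system):

```shell
# Drop automatically installed packages nothing depends on anymore
sudo apt autoremove

# See which versions the configured repositories can give you
apt-cache policy firefox

# Install a specific version with package=version
# (version string here is a placeholder, not a real current release)
sudo apt install firefox=59.0.2+build1-0ubuntu1
```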
So here, the minus says to uninstall and the plus says to install: you put a minus after a package name to uninstall it and a plus after a package name to install it, all in one command line. You know, in the decades I've been doing this, I've only needed to do this a couple of times. The last time, actually I think I needed it once in the last five years, and I'm pretty certain it was my own fault. Fifteen years ago we needed to do this once in a while, but the Depends and Recommends metadata has gotten much better testing, so I haven't heard of this coming up in a long time on Debian. Come talk to me afterwards and we'll talk about other packaging systems really, really needing that tool. All right. And then reinstalling: `apt-get` makes that easy, there's a command-line switch for it. `apt` does not have that switch, so you have to spell out more of what you want to reinstall. Now why would you want to reinstall? For the most part, you don't. But occasionally you want to reset the configuration files, or the install didn't go right, or you ran out of space during the install and things are kind of higgledy-piggledy. This is a way of doing it. Again, I haven't run into this in a long time, because generally other things break first and they just take care of it. Now also, there are tasks. If you remember back on the package information, it mentioned what task a package was in. So you can choose different tasks. For the most part, we don't look at these, but there are certain ones that are kind of useful. If you want to install a mail server, they make it fairly easy to get a mail server and some associated packages, or an OpenSSH server, some different things. So look at tasksel and see if there's anything that looks interesting for you.
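The plus/minus trick and the reinstall switch look like this (package names foo and bar are hypothetical stand-ins):

```shell
# One transaction: remove foo, install bar (trailing - / + flips the action)
sudo apt install foo- bar+

# Reinstall via apt-get's switch (the talk notes plain `apt` lacked one)
sudo apt-get install --reinstall asciidoc

# Browse the available task bundles, then install one
tasksel --list-tasks
sudo tasksel install openssh-server
```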
There are two or three tasks that I use, actually I think every time; I've got them baked into a script, so I don't look at it. Things in the file system. So /etc/apt/sources.list is your main file that lists where you're gonna get your packages; that's where your repos are listed. There's also sources.list.d. That's where I put my security stuff, right? I put it in sources.list.d and pull it out of sources.list. There's also /etc/apt/apt.conf.d, which is where you can make configuration changes to your system. And I'll point out a specific one that was necessary recently, or at least useful. You could add this on the command line, but you can also just add it to a file. What happened is, I mentioned that because we have signatures on our packages, we don't need to worry about downloading them securely. Well, it turns out there was a bug in APT that allowed somebody to cause a man-in-the-middle problem. And so, in order to get some updates, we needed to turn off that particular feature long enough to get it fixed, because if you just ran `apt update` before you got the fix, you could get hit by the man-in-the-middle problem. So I went through and just added this to all my systems, so that every `apt update` I do for a while will automatically have this mitigation in place. That way, if somebody goes, oh, now that I see you could do that, here's this other way of doing it, and here's another way: well, we'll get the updates in there for a while, and then sometime in the summer I'll remove that file. All right, configuration files. Two big things about configuration files. First, it's where your local stuff is. Debian tries to ship with default configuration files that make sense. So if you install a mail server, you get a mail server that comes up running. If you install a web server, same thing: by default it just works out of the box.
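He doesn't read out the exact snippet, but this sounds like the early-2019 APT redirect bug, for which the circulating workaround was disabling HTTP redirect handling. Treat the filename and directive below as my assumption about what his drop-in looked like, not a quote from the slides:

```shell
# Drop-in file, e.g. /etc/apt/apt.conf.d/99mitm-mitigation (name arbitrary):
#   Acquire::http::AllowRedirect "false";

# Or as a one-off on the command line until the fixed apt is installed:
sudo apt -o Acquire::http::AllowRedirect=false update
```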
And then they have a very strong rule about not changing other packages' configuration files. All right, so Debian's configuration-file requirements: preserve local changes. If the local sysadmin, the owner of the box, has made a change, don't change it out from underneath them without permission. Preserve configuration files on remove; delete them on purge. So these are the different requirements they have. The last one, like I say, is really important: don't touch another package's configuration file. They just default to something generally useful. Now, one of the things the tools can do is say: oh, the configuration file in this new version of the package has changed, what do you want to do about it? If you do automagic installs, it just defaults to not changing it, you know? But if you're installing and watching it manually, you can say yes or no, you can look at what the differences are, and it gives you several different options within the tool. If you're installing automatically, so it can't ask you those questions, it will drop the new versions inside /etc alongside your files. So here's a command line to find out where some of those are, so you can see if there are packages where you want to address configuration changes that have taken place. There's also a thing called pinning. Let's say that for Firefox, that example, I didn't want to move to the new version. I could pin that version and say: this is the version I want to keep installed until the time I'm ready to change things. It doesn't matter how many new versions of Firefox come out, I'm sticking with the version I've got installed. So we can use pinning for that. Again, pinning can be an entire talk because of the dependencies and stuff, so I'm not gonna go over it. These are tools that you can use for it. Again, grab my slides afterwards; they will be online.
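The "new versions dropped inside /etc" land there under dpkg's standard conffile suffixes, so a find like this (my sketch of the command he flashes on screen) turns them up:

```shell
# Locate new/old config versions dpkg parked next to your edited files
sudo find /etc \
    -name '*.dpkg-dist' -o -name '*.dpkg-new' -o -name '*.dpkg-old' \
    2>/dev/null
```

Each hit is a file where the package shipped a changed default that was not merged into your live configuration.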
All right, `sudo apt-mark hold`. This is another way of doing that, saying: just keep this package at this particular version. So I could also say, okay, for Firefox, whatever version I've got, I'm gonna put Firefox on hold so it won't get updated when new versions come out. A different way of doing the same type of thing. Pinning is also good, I forgot to mention, for repositories. So for instance, if you need to pull something from testing, you can use pinning to make sure stable is where you normally get stuff from, and you only get things from testing if you specifically ask for them. All right, replicating installs. `dpkg --get-selections` will tell you what packages you have installed, and if you pipe that to `dpkg --set-selections` on a different box, that will set the selections that box is gonna install. I don't give you the full command line to do all that, because I want you to go learn a little bit more about how this works before you replicate things. So there you've got those. `dpkg -l` lists what you've got installed. And there's also a tool called apt-clone; I've never used it, but it looks like it would be useful as well. Fixing breakage. So occasionally we have packaging problems; occasionally we have issues. The main ones I run into are when I run out of space, especially if I get too many kernels involved and things like that. So `apt --fix-broken install` is a way of saying: do an install, but fix the things that are broken. And you might also need to do a `dpkg-reconfigure -a` to reconfigure things. So let's say you find a file on your file system and you're like, I don't know what this is from, right? It's in /var/lib something-or-other. Or take do-release-upgrade: how did I find what package do-release-upgrade came from?
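In command form; note the speaker deliberately withholds the full replication recipe, so the commented `dselect-upgrade` step is my fill-in of the usual final step, and you should read the docs before relying on it:

```shell
# Hold a package at its current version, and list current holds
sudo apt-mark hold firefox
apt-mark showhold

# Capture one box's package selections...
dpkg --get-selections > selections.txt
# ...then on the other box (sketch; verify against the dpkg docs first):
# sudo dpkg --set-selections < selections.txt
# sudo apt-get dselect-upgrade

# Repair a wedged install (out of space, interrupted, etc.)
sudo apt --fix-broken install
```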
Well, I used `dpkg -S` with do-release-upgrade, and it told me what package it came from. So that's the way of searching for which package owns a particular file. If it's a dynamically created file, like some configuration files that get created by scripts, those aren't gonna be listed by `dpkg -S`, but your /var/lib files and the like should all come back with a result. And then `dpkg -L` will list the files in a package. So if you run `dpkg -L apt`, it'll show you all the files that the apt package installed. All right, snaps. I promised I would talk about snaps as well. So snaps are somewhat distribution-neutral, and I say somewhat because Canonical and Ubuntu created them and enabled them inside Ubuntu, and they've also worked with developers on Red Hat and other distributions to get support. But from everything I've seen, they haven't really worked with Red Hat itself; they've just worked with people who work on those platforms. And they've done some of the work to keep them supported, but it's mostly an Ubuntu thing at this time, right? One of the features you get with that is automagic updates. So if you're using the Firefox snap or the Nextcloud snap, I see those come up all the time, then when a new version comes out, you automatically get the new version. I think it defaults to checking for new versions something like four times a day, and it will go through and install them for you. They're easier for developers, because developers don't have to learn about how Debian does stuff and how Ubuntu does stuff and how Red Hat does stuff and how Mint does stuff, and what's the difference between Debian and Ubuntu, and the difference between Red Hat and Fedora, and all that. They just have this one thing, so it does make things a little bit easier for developers. They're also self-contained, and they're containerized-ish.
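Both directions of the file-to-package lookup, using the apt package itself as a safe example (do-release-upgrade may not exist on plain Debian, hence the commented variant):

```shell
# Which package owns a file? dpkg -S searches installed packages' file lists.
dpkg -S /usr/bin/apt

# The reverse: list every file a package installed.
dpkg -L apt | head

# The talk's example, on an Ubuntu box:
# dpkg -S "$(command -v do-release-upgrade)"
```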
So snaps are trying to put things in a container. They're not, you know, we're having container days for the next couple of days, where you're gonna learn about Kubernetes and LXC and things like that, where you're making actual containers. They're not doing that, but they are putting in security rules and some other things such that a snap package can't change files from other packages. So they are locking them down a little bit, actually more than a little bit, they're locking them down fairly well, but it is not a fully isolated system. And they're really good for things like IoT, Internet of Things, and kiosks, things that aren't gonna change. What they're not good for, and I see that person is still in the room, all right. What they're not good for, sorry, are production systems, where you don't want things to change without you saying they're gonna change. So one of the problems they have is that they are automagic. They will do automatic updates, and you don't have a choice about that. I found a post by a guy whose package manages firewall rules; that's what his package does. You don't wanna reset your firewall every time he comes out with a new version of it. You wanna reset your firewall when you're ready to reset your firewall, all right. And I had a couple of other slides about snaps that we're just not gonna be able to get to, and it's about time. The slides will be available online, and we usually link to them from the SCALE site, but I did wanna specifically thank Brian and Richard, because they asked for this talk.
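For the auto-update complaint, snapd does expose the refresh schedule; a sketch of inspecting and deferring it (you can shift the window, but stock snapd won't let you disable refreshes outright):

```shell
# Show when the next automatic snap refresh is scheduled
snap refresh --time

# Constrain refreshes to a maintenance window (example window is arbitrary)
sudo snap set system refresh.timer=sat,02:00-04:00
```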
So Richard asked me to do a talk for UbuCon, I appreciate that, and he'd been asking me to do some Ubuntu-based material for a while, and Brian had specifically asked about some of the other, more esoteric tools and things that you don't see. Because of my power outage, I won't get to the tool he specifically wanted me to talk about, which is dpkg-divert, but I will go ahead and bring it up. So dpkg-divert is a tool he uses, and the idea is that when you install a package, instead of a file being installed in its normal place, you can tell dpkg to put it somewhere else. That way, when another package update comes along, it won't wipe out what you've got in place. In his particular case, he then created a wrapper that calls the real tool. He's like: I wanna call this tool, but I want it to always have this configuration option. So he used dpkg-divert to put the tool elsewhere, and then created a wrapper script that always calls the diverted tool with that configuration option. Do we have any questions, other than how many more ways this presentation can go sideways today? Yes, sir. Yeah. Yeah. I'll be here for a while. Yeah, sir. Yes. Yeah. Okay, thank you. I've seen that, but I've never looked at wajig, so I'll take your questions in reverse and see if I can remember all of them. So wajig I've never looked at; I have seen references to it, but I haven't looked at it. For the examples, yes, I heartily agree. One of my favorite tools is procmail. If I could just keep procmail and grep, my life would be fantastic. One of the things I love about procmail is that it has procmailex, which is a man page specifically for examples. So it's got procmailrc, which documents the configuration file; it's got the regular procmail page; and procmailex with a whole bunch of examples. And I still love that.
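The divert-plus-wrapper pattern he describes looks roughly like this; the tool path `/usr/bin/sometool` and the config option are hypothetical stand-ins, since he doesn't name the actual tool:

```shell
# Move the packaged binary aside; future package upgrades follow it there
sudo dpkg-divert --add --rename \
    --divert /usr/bin/sometool.real /usr/bin/sometool

# Install a wrapper at the original path that forces the desired option
printf '#!/bin/sh\nexec /usr/bin/sometool.real --config /etc/sometool/local.conf "$@"\n' \
    | sudo tee /usr/bin/sometool >/dev/null
sudo chmod +x /usr/bin/sometool
```

Callers keep invoking `/usr/bin/sometool` as before, but always get the forced configuration, and upgrades never clobber the wrapper.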
And again, I'll look at it once in a while just for fun, because it's fantastic. And then the first question was dependencies. So it'll show you the dependencies. There are times when you have conflicting dependencies, where you're trying to do something, something has happened, and either you're trying to install two packages that conflict, or something is broken in how things are packaged right now. And so it gives you a warning and says: I can't fix this for you, you need to figure out how you wanna fix it, because either way I'm gonna break something. Regardless of what I do, you need to decide what I'm gonna break. Or wait until it's all fixed upstream, if it's a packaging issue. So yeah. Okay. Okay. I remember seeing that string somewhere, so I'm presuming I've seen the tool, but I don't know anything about it, so I can't speak to it, yeah. Oh yeah, okay, good point. So, coming back to an earlier question: the info pages are often maintained better than the man pages, and there are some that are more up to date. The info page for date is more current than the man page for date, by something like a decade; they still haven't added some stuff to the man page. Yeah. The other thing is, if you're blind: a friend of mine did a presentation here a couple of years ago about being blind and using Linux. He does a lot of things using Emacs, because he gets Emacspeak, and there are a lot of other things in Emacs where your fingers never need to leave the keyboard, right? And Emacs is much more dialed into info pages, and does a much better job with info pages than man does, if Emacs is the tool you're using. So, next question then. Mm-hmm. Sure, so apt-cache search will search more fields inside the package. apt search will search the package name, the one-line description, and a couple of other things, but a lot of times what you're looking for is in the long description or something like that.
So for me, I would rather get more results and whittle them down than fewer results that miss the thing I was looking for. And some of it depends on who you are and how you do things. I haven't seen anything like that. There are people that do a `sudo apt update` and an `apt full-upgrade`, and you could add that to a script and just call your script, or you could just type it all out on the command line. I actually used to do that for a while. So look at procmailex: it's `man procmailex`, man, a space, and then procmailex as one word. And that has a lot of really good examples. I haven't, at least in a long time, done anything where I'm specifically saving attachments, but I know there are things in there for splitting a message up. You might end up needing to use formail in order to do that, but the examples page probably has something that will get you going in the right direction. Okay. Okay. Yeah. Yeah, and look at procmailex and see if that gives you what you need. And if it doesn't, join a local Linux user group or something like that and ask there. You're probably gonna find lots of people like me that just love procmail and might go after it just for fun. Brian, you're nodding, they do? Okay. I don't know. I haven't noticed it, but also, Debian stable is a couple of years old, so it might be that it's in the freeze and coming out in the next release. I personally am not a huge fan of snaps, for a couple of different reasons. One of the things my slides had: as I say, you get the automatic updates. One of the problems I have with snaps is, it's nice that, for instance with Nextcloud, you get the new version, and if there's a security fix, you get the security fix. But if Nextcloud depends on some library, do I know that Nextcloud's snap is getting the new version of that library?
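The procmail documentation pages he keeps referring back to, plus the update-then-upgrade pair people script up:

```shell
# The combo some people wrap in a script
sudo apt update && sudo apt full-upgrade

# The procmail man pages mentioned in the Q&A
man procmailrc   # configuration-file syntax
man procmailex   # a man page that is nothing but worked examples
man formail      # mail (re)formatter, useful for splitting messages
```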
So a number of years ago, we determined that we had half a dozen different places in the kernel that had a security problem with how they were using gzip. And there were two issues. First, there's gzip implemented inside the kernel, and then there are six different implementations of gzip in the kernel, right, instead of everything using the same one. But there was a security problem with gzip, and now the kernel was susceptible to it. And so for something like three months we kept getting new kernel updates as they found, oh, there's another copy of gzip that we need to go repair, right? So I like it when I install foo, and foo depends on libc, and if there's a security problem in libc, we get the update for libc, and then foo is covered as well, because you've already gotten that. Whereas if I'm using a foo snap that has its own copy of libc, I don't know. I've been around long enough that I've been down this road before, and I've seen packages that keep old versions of libraries. The kernel is one of them, because the kernel can't call out to shared libraries; they have to be embedded. So there are some systems where you have to have things embedded. But for libc, the primary thing, you know, it's hard to keep up, right? And I'm worried about those dependencies being maintained. So there might be a way of escaping; we ran into that recently, there was an escape. As for classic confinement, it turns off some of those restrictions, and you have to go through extra approvals to become a classic snap, so Ubuntu does give you a review of it. So another thing with snaps: for Debian packages and Ubuntu packages, I know that those maintainers have reviewed those packages and the changes in them, right? They haven't done a full security audit, but I know two people have looked at it, right? At least, well, to some extent, because sometimes the same person upstream is also the packager.
But you've got a little bit of extra implied warranty with it that way. For the containerization, the default for snaps is that they've got them contained; the idea is to make it so they can't escape, right? Whether or not that fully holds, I don't know, but the idea is to limit that. That reminds me of something else I wanted to say, and now I don't remember what it was. Oh, actually, you know. Ha ha ha, you guys can't see it, but I can: everybody gather around the other laptop, it has updates. Oh, the other thing for snaps is that Ubuntu, or Canonical, is requiring a CLA on that. So to me, that's an inhibitor for sending things upstream. Yeah, so strict is the default that you were talking about, where they've got things locked down. Classic requires approval. There's also devmode, which can't be released as stable, so they're allowing developers to have a playground, but then you can't release that to the wild. That's the idea, anyway. Okay, and we've gotten through most of the stuff that I wanted to get to. Any other questions? Well, thank you, and thank you for sticking around towards the end, appreciate it. Check, check. Hey, we've got audio. I assume I could probably yell and everyone would hear me, but, you know, just courtesy, right? All right, so while we're waiting for people to trickle in, anybody wanna have a conversation about what this talk is about? Is this recorded? Okay, who's doing that? Raise your hand. Is this recording it? Oh, okay. Oh, just casting straight. That's cool. Awesome. All right. Okay. I'm Dave. Hi. Yeah, I'm excited. I was here last year and was kind of disappointed that there wasn't a talk like this, so hopefully. Yeah. Right, right, so that's right. And I figured this is the right place to take it, so. George, nice to meet you. I'm Dave.
So if I could get a show of hands: how many people know how to program in a programming language? Fantastic. All right, that's exactly the kind of people I wanted to have here. Is there anyone here that doesn't have Ubuntu, or hasn't installed Ubuntu, or at least has a machine that's running Ubuntu? Everyone interacts with Ubuntu, right? Okay, good. I had someone ask me about this this morning and I was kind of like, well, if you're not running Ubuntu anywhere, this is probably not the best talk to start with. So, anybody have an idea as to what they want to get out of this talk? Kind of a show of hands. Yeah. That will basically be the talk here. Yeah. In terms of packaging things that are completely new, that is not what I was expecting people to want here, but I do have a few links in my slides that will cover exactly what you want for that. You had another thing? I had another hand, any other hands? Make it official. So it's been sitting in Launchpad. Ah, so you wanna do a universe inclusion request, is what you wanna do. I can give you the buzzwords so that you can Google to find the right wiki page; that's probably the best way to do it. I've never actually done a universe inclusion request, but that can be done, and I can hopefully hand-hold you a little bit afterwards. This talk probably isn't gonna cover that use case as much: basically basic Debian packaging, but the process of doing a universe inclusion request is probably outside its scope. Anybody else have ideas as to what they wanna get out of this? That I will cover, that I will generally cover. Yeah, so good, you're in the right spot, fantastic. So it sounds like everyone is kind of figuring this out; you're all pretty much in the right spot. What I'm gonna be covering here today is this: we all run Ubuntu, we all hit problems.
And my goal is really gonna be: when you hit those problems and then you go and fix them for yourself, how do you get that pushed back into the archive, so that you don't have to maintain a package yourself? So that's really the use case that this presentation covers, and I'll go into it. They asked me to leave a few more minutes after lunch, just because people are filing in from lunch, so that is what it is. I'll give it another two minutes. But yeah, this is awesome, yeah. Anybody have any questions for me while I'm standing here and just kind of wasting time? So the question is: I work at Indeed and I do Linux development, and what is that like? Basically, Indeed is now getting to the size where, well, they were small and had mostly application developers, and now they are getting larger and they're starting to hit problems at scale that happen to companies that are larger. Previously they would work around it: oh, well, CentOS is not working, let's try Moji, let's do that. They made crazy changes, and now they've realized that doing that kind of stuff is not always the best, because you exchange one problem for another. So my job is really to fix the hard problems that other people run into. So I'm kind of internal support for Linux in general at Indeed, covering anything a developer runs into where they go, I'm stuck, right? So that's kind of what I do. Yeah, Indeed.com, I work for Indeed.com. All right, well, we are eight minutes in; I'll just jump into it. Hi everybody, I'm Dave Chiluk, and this is the Ubuntu development primer: how to stop monkey-patching and start committing. I'm Dave Chiluk, chiluk on IRC, and I am what is called an Ubuntu core dev. What that means is that I have archive permissions: I can upload things into Launchpad for inclusion in Ubuntu.
So if you have things that you want sponsorship for, take note of me, ping me on IRC, and tell me that you saw me at SCALE, and I promise that I might try to help you out. I'm a little slow, because I don't do as much IRC work as I used to, since it's lacking from my day-to-day job, but I'll do my best to try to respond to those things. Yeah, I'll get into that, I'll cover that. So, I work for Indeed; I'm a Linux platform engineer. You could also call me a Linux software developer, but basically I support anything from the kernel to user space to anything else, and unfortunately my day job usually covers mostly CentOS, but the entirety of our development environment runs on Ubuntu, so that's why I exist. Prior to that I worked for Canonical, on Ubuntu's sustaining engineering team, where I did engineering support for Ubuntu Advantage, which is the support arm of Canonical that allows you to get paid support. And prior to that I worked for IBM in the Linux Technology Center, which is really what got me into doing Linux development. My wiki page is really kind of bare-bones, but it's proof that I exist in Ubuntu, right? All right, so the topics we're gonna cover today: we're gonna talk a little bit about getting support. We'll cover a little bit about Launchpad best practices. We'll talk about how to modify sources from the archive. We'll talk about building, and then submitting changes back into Ubuntu. And if we cover all of that, I'll talk a little bit about the kernel processes, because the kernel processes are a little bit different from the rest of the packages in Ubuntu, because it's the kernel. All right, so why you might care. Does anyone here have self-maintained packages on Ubuntu? Basically, you found a bug in a Python script and then you went and manually hacked in a fix for it, or you have a MySQL server that you needed to add a patch to.
That is what we're gonna cover here today: how to fix those things officially, create a Debian package from the fix, and then push it back into Ubuntu so the rest of the community actually benefits from it. If you're interested in giving back to the community, that's another great reason to be here, because you kind of need to know this process in order to actually give your work back to the community. If you're tired of the maintenance burden of one-off packages, or, like this guy who wants to do a universe inclusion request for his package, that's another great reason. The other thing is, I'm a big believer that mentorship beats wiki pages. Finding someone who can say, hey, you need to search for these terms on the wiki and that'll lead you to this page, beats you trying to find those search terms and reading all the wiki pages, hands down. So that is why I'm here. Teaching is a passion of mine. If teaching paid better, I'd probably be doing that, but it doesn't, so I am a nerd and I try to teach nerds when I can. The other problem is that wiki pages can always be outdated. When I wrote this talk originally they were outdated; fortunately, I've noticed that they've actually gotten way better, so I probably should cross that line out. All right, question to the room: where does everyone go for support? What do you do when you have a problem? What's your first step? Can I get a show of hands? Google, that's my first one too. Yeah, anybody else? Any other thoughts on how you fix problems that you hit in Ubuntu? Stack Overflow, that is major, and Stack Overflow has been an amazing boon to our community. What's that, going straight into the source code? Well, that is fantastic, but if you don't know how to get to the source code in Ubuntu, that's probably a little difficult. We're gonna cover that, and that's actually a big reason that you might want to be here. Yeah, what?
strace, yeah, using utilities and actually doing the deep dives. I love what I'm hearing, because you guys are the exact people that I wanted to have come to this talk. All right, so yeah, a number of these were covered. Ones that weren't covered were the Ubuntu Forums, and Ubuntu Advantage, which is paid support through Canonical; I used to do that, I love those guys, I'm gonna shout out to them here. IRC: you know the channel I was talking about earlier? #ubuntu is probably the best first place to go for support, if you don't know whether it's a real bug, if you think it might just be, you know, a PEBKAC problem, problem exists between keyboard and chair. #ubuntu-devel is gonna be for people who've already created a fix and are wanting mentorship on how to actually get that fix included, or if you have problems in that area or you're an actual developer. So avoid posting in there otherwise; you're gonna be told to move to #ubuntu unless you keep those questions technical. #ubuntu-bugs: if you already have a bug open and are looking to progress it in a technical manner, you have a solution, you have a workaround, things like that, go to #ubuntu-bugs, all right. The last one is really gonna be Launchpad. If you've exhausted all hope and you actually have that bug, or you are about to create a bug, Launchpad is where you're gonna do that. Keep in mind that this is not a forum. This is not a place to post, hey, I did this and this thing broke. You wanna know that this thing broke because it's actually broken, before you start opening bugs in Launchpad. Otherwise you end up wasting a developer's time that might otherwise be spent actually fixing a real bug, right? So let's try to keep Launchpad not a forum; let's use the forums for forum requests and forum things. Oftentimes we'll see people open threads on the Ubuntu Forums that end up being opened as bugs in Launchpad. They're like, oh, you know what?
We've exhausted all hope, the community hasn't figured out a solution, so let's just open a bug and go from there. All right, so Launchpad. Let's assume you found your bug on Launchpad. A lot of people love to post "oh, this hit me too, why have we not fixed this?" — and those people are not clicking this button. This is a very important button for me as a developer, because it helps triage whether people are actually seeing this bug. So please do not put comments in unless they're technical and useful comments; if they're "me too" comments, don't do that. Instead, on the bug — can we read that? It's a little low resolution, but that line says "This bug affects 5 people. Does this bug affect you?" If you click the little pencil next to it, you can click "yes, this affects me too," and then you can subscribe yourself to the bug, and that increases the heat. As developers go through and look to see which bugs the most people are hitting, and which will have the most impact if they go and fix them, they're gonna be looking at the heat to determine which are the most important bugs. And of course, make sure you subscribe yourself, because I can't tell you how many people will create a bug, fire and forget it, and never actually respond to requests for more information. Yeah, that number 34 is a combination of things: how many people have clicked "this bug affects me too," how many duplicates have been closed as duplicates of this bug, and a number of other heuristics. I'm not exactly sure of the exact algorithm, but that's basically what it is; it gives you a good rule of thumb. Also, if you see a bug that affects you and there's a heat of two on it, you're gonna go, oh, that's why it's not fixed, right?
All right, so let's say you know the bug and you have a fix for it. The next thing you're gonna want to do is assign yourself. And if you go through the effort of trying to fix it and you give up, make sure you remember to unassign yourself. Otherwise the Ubuntu core devs are not gonna know the status, and they'll probably ping you like, "hey man, are you working on this?" And all of these things — adding comments in Launchpad, creating bugs, assigning things to yourself, fixing them — all result in this thing called karma. As you can see, my friend Colin Watson here, who's actually one of the Launchpad developers, has a karma of 80,670, which is insane. It's the highest one I know of; mine is way less than that, since I don't do Ubuntu development as my day job anymore. All right, so what do you do if there's no related bug? The first thing you need to do is update your package and make sure it's at the latest version, because if you're creating a bug against an older version of the package, the first thing a developer's gonna do is say, "hey man, you're back-level on this version, let's get up to the proper version first." Next, make sure you include all of the logs, including debug output if the command you're running has a verbose option, and if there are any memory dumps, include those as well. The next thing you might want to do is test the Ubuntu development packages. By development packages I mean, today, that would be Disco — the release that's in development, the latest thing. The reason we want that is because it tells us whether there's already a fix in the Ubuntu package. Sometimes, if it's a small delta between the version you're on and the one in development, we'll just do a backport of that package into the backports archive.
All right, I had a few questions. How do you get a memory dump? Typically you're gonna have to set ulimit — you're gonna have to configure ulimit to actually save memory dumps. It might be on by default; you might have to check. It might just be on for me because of all the packages I've installed, but you usually have to do that. And if it's not dumping for you, don't go out of your way to create a memory dump, because that's just way too much effort. Including a step-by-step how-to to reproduce the problem would be way more useful. Other questions? All right. I know this is probably remedial given how advanced this group seems to be, but it's important to go through, because bug etiquette on Launchpad is kind of not obvious. So given all of these requirements, a better way to create a bug on Launchpad is to actually run ubuntu-bug against the package you're using. The reason is that this will go through and collect all the logs for you, and then open up a browser with the apport output. Sorry, that threw me off — it'll collect all of your logs, create the bug for you, and also check for duplicates based on the information it grabs. So what this does is help triage bugs for Ubuntu without having to have someone doing the triage; it allows us to have distributed triage, essentially. So that's cool. All right, so let's say you've identified that it's a real bug, there are no duplicates, and you've opened your bug, or assigned yourself to the bug you discovered. What's the first thing to do? You've checked that the development series doesn't have a fix for this. The next thing to do is really check the upstream project.
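The core-dump step above can be sketched in a couple of lines — this is a hedged example, and `myprog` is a made-up placeholder for whatever command is crashing:

```shell
# Enable core dumps for the current shell before reproducing a crash,
# so a memory dump exists to attach to the bug report.
ulimit -c unlimited      # allow core files of any size in this shell
ulimit -c                # prints "unlimited" if the change took effect
# ./myprog               # reproduce the crash here; look for a "core" file
# ubuntu-bug myprog      # then file the bug with logs collected for you
```

The `ubuntu-bug` line is commented out because it needs a running Ubuntu system to do its log collection.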
So let's say you have a problem in MySQL, and it's not fixed in the Disco release of MySQL. The next thing would be to grab MySQL, build and install it the way MySQL recommends, and see if the problem is now resolved in MySQL's upstream. That is a good data point, because then what we can do is a git bisect on MySQL to identify the patch that resolved your problem, and then we can backport that patch into the versions we care about. For example, it might go into Xenial — Xenial's mostly security updates right now, but if it's a bad enough bug it goes Xenial, Bionic, Cosmic, and then into the development release as well. If it's not fixed in the upstream project, really, stop talking to Ubuntu about it, because there's not much we can do. Ubuntu is a distro: what we do is package upstream sources to make them easy for people to install. If the upstream sources don't have a fix for the problem, the first thing to do is fix it in the upstream project. Once it's fixed upstream, we can then get the patch accepted back into the Ubuntu archives, because we want the people who understand the sources best to vet that patch before we push it into Ubuntu. That's the process for keeping Ubuntu stable: things are forced to go into the project's upstream before they come back into Ubuntu. Everyone clear on that? I got a lot of good head nods, all right. So what does it look like once you've got that patch — it's been upstreamed, or you've identified it in the upstream? The next thing to do is grab your sources from Ubuntu and create a patch in the debian/patches directory. You're gonna edit the changelog, identifying yourself as the patch owner.
You're gonna build it, test it, and submit the debdiff back to Ubuntu — which is actually a little more interesting, so I'll cover that — and then we're gonna test it some more. All right, getting the sources. The first thing I like to do when I know I'm gonna be doing development on an Ubuntu package is grab a bunch of tools used for Ubuntu development. You'll see ubuntu-dev-tools, devscripts, build-essential, fakeroot, and a few others — this is basically my de facto copy-paste install when I've set up a new machine, because I know I'm gonna need them at some point. You might need some other compilers or interpreters based on the languages you're looking at, but this covers most of it. Has anyone heard of rmadison before? All right, I'm gonna blow your mind then, because this is amazing. It's one of my favorite tools for finding things out about Ubuntu. What it does is show the version of a package in every supported release of Ubuntu — actually every supported archive of Ubuntu — so it'll even split out what went into the original release archive versus the updates archive, and it might even show you what's in proposed, if proposed is different from updates. Then I'm gonna just make a directory — in this case I might be looking at bash — and I'll run pull-lp-source on bash for Bionic. Now, when everyone looks at this, they go: what is pull-lp-source, and why am I not doing apt-get source? Well, the problem with apt-get source is that it only works if you have the deb-src lines in your apt package lists. What happens if I'm running Bionic on my laptop and I'm trying to fix something in Disco?
Now all of a sudden I'm having to add the deb-src lines for Disco to my Bionic laptop, and that's not a good idea, because mixing series is not a good idea. What pull-lp-source will do for you is grab the sources directly from Launchpad for the series you request. Yeah, let me get to my demo, and if it doesn't answer your question then ask it again, okay? All right, so what you're gonna see when you grab these sources is a debian directory. I wasn't able to attend the last session at UbuCon, which covered a bunch of packaging things, but basically what you're gonna find here is a changelog. That changelog is not the changelog you might think it is: it is the changelog of all the changes in the Debian and Ubuntu delta on top of the original sources we forked from. Okay? You're gonna see a rules file, which is essentially a makefile that builds the package. You're gonna see a control file, which has all the package definitions. For example, with MySQL you're gonna have mysql-server, you're gonna have mysql-common, you're gonna have the MySQL libs — a whole bunch of packages — and all of those package definitions are in the control file. And you're gonna see a patches directory. Now this is what we really care about as developers trying to fix Ubuntu and push fixes back into Ubuntu. The patches directory is basically the delta that Debian and Ubuntu have added on top of the original package that came out of the upstream sources. We forked, and this is the delta.
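The patches-directory layout described above can be sketched with a toy example — the directory and patch names here are made up for illustration, not taken from a real package:

```shell
# Toy layout showing how the distro delta is kept: plain patch files under
# debian/patches, with a "series" file listing them in application order.
mkdir -p pkgdemo/debian/patches
printf 'fix-prompt.patch\nfix-docs.patch\n' > pkgdemo/debian/patches/series
cat pkgdemo/debian/patches/series    # patches apply top to bottom
```

Tools like quilt simply walk this series file from top to bottom when pushing or popping patches.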
There's a patches/series file which determines the order in which those patches get applied, and you can use the what-patch tool inside that source tree to determine which patching mechanism is used to manage those patches. All of this is described on the Debian packaging wiki, which is what that man right there is gonna need to know about in order to get things pushed into Ubuntu. So creating that Debian package — I don't know if you've gotten that done, but that's step one. After that you're gonna have to create a universe inclusion request. Yeah, is there a one-to-one? I cannot guarantee that, but I'm pretty certain that's true. Which brings me to my demo. All right, let's see — no one can see this. Tell me when you can see it; start reading in the back there. We're good? All right, rmadison takes a second because we're hitting the archives. So this is a ton of text, a wall of text, and the columns we care about most are the second and third columns. If I highlight stuff — great. So there was a question earlier as to how to know what version we forked from. Typically, when Debian forks from an upstream package, they use the version from the upstream package — usually an upstream tag. So in this case, for bash, that was probably 4.3. Then Debian adds a hyphen and a Debian revision number. And when Ubuntu forks from Debian, we typically append an "ubuntu" revision on top of that. All right, so basically we can do a quilt pop -a, which removes all of the patches and gives us exactly what we had when we originally forked from the upstream bash version, and I can do a quilt push -a, which pushes everything back on. That's the basics of quilt.
There's also — I'll get to this a little farther on — one thing to think about with quilt, which is not always the best... huh? Battery? All right. So quilt is great, and you're gonna have to know a little bit about quilt, but it's not my preferred way of doing development. What I prefer to do is make my changes to the sources, and then do a dch -i — which you're gonna have to do in both cases — which increments and updates the changelog. Then you can do a dpkg-source --commit, which diffs your current sources against the original package and creates a patch for you from that diff, which is way more automated. What, yeah? Huh? dch just stands for Debian changelog; it's just another utility, I'm throwing it out there. You could manually edit the changelog, but I'll show you in the demo exactly why you use dch instead of editing the changelog by hand. So: dch -i, then dpkg-source --commit creates the patch in debian/patches for you, adds it to debian/patches/series at the end of that file, and then launches a text editor asking you to fill out a DEP-3 header for the patch you just created. DEP-3 is the standard Debian and Ubuntu use to tag each patch, identifying where it came from, who wrote it, and where it's going — essentially the three major things. There are probably a few more that someone's gonna yell at me about on the internet, but whatever. So with dch -i you're gonna edit the changelog, and this adds a changelog entry. You're gonna want to include the Launchpad bug description and the Launchpad bug number, because once the fix gets included in the archives, that automatically works through the workflow in Launchpad, moving the bug from In Progress through Proposed to Released, right?
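The version string that dch -i bumps can be unpacked like this — the version here is a made-up example following the upstream-debianrev-ubunturev scheme described earlier:

```shell
# Anatomy of a made-up Ubuntu package version: upstream 4.3, Debian
# revision -2, Ubuntu revision ubuntu1. dch -i bumps the last piece.
v="4.3-2ubuntu1"
base=${v%ubuntu*}      # "4.3-2" — upstream version plus Debian revision
rev=${v##*ubuntu}      # "1"     — the Ubuntu revision
echo "${base}ubuntu$((rev + 1))"   # the next upload would be 4.3-2ubuntu2
```

In practice you let dch do this, since it also opens the new UNRELEASED changelog stanza for you.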
One thing to keep in mind is that you're gonna want to follow the format of the other entries in the changelog, because we're based on Debian, and each Debian maintainer keeps their changelog in a slightly different style. We want to keep it consistent, and the best way to do that is just to look lower down in the file. Then you're gonna want to make sure your patch actually got listed in debian/patches/series. I haven't ever seen it not happen, but it's a sanity check. All right, so we're gonna now edit bash. Can I get my mic stand again? I am so sorry, man, what's your name? George — nice to meet you, thank you, man. All right, so we've now got bash. Has anyone ever wanted to mess with their coworkers? All right, so this is awesome. Anyone know what PS1 is? It is the shell prompt, yes. I happen to know that PS1 gets evaluated in y.tab.c — because I'm a responsible presenter and looked this up earlier — so I'm gonna grep for PS1, and I am going to comment this out: PS1, prompt equals dollar-space, quote. All right — actually, I'm gonna do it wrong first, right? So what happens next? I made my changes. Now what do I do? Did anyone memorize my slides quickly enough to figure out what I have to do next after making my changes? Anybody? dch -i, yeah, that's the next thing to do. All right, this is what dch does for you. If you don't use dch, you're not gonna get this bash version number bump. The -i says to increment the version number; you notice it immediately went from -2ubuntu1 to -2ubuntu2. It set the release to UNRELEASED, because it doesn't know which archive it's headed into. And it also added my tag — my DEBEMAIL, which is an environment variable set in my bashrc.
And then it also put in the timestamp for when I made the change. Oh — thank you, Darryl, you deserve this. What I'd do now is also make sure I'm referencing a Launchpad bug. If you notice down below, we have the Launchpad bug number; we're gonna be solving LP: #1234567. All right — oh, the last thing we need to do is worry about which version of Ubuntu this is going into. This is gonna be Bionic, because it's a Bionic package I'm picking. All right, cool. Yeah — edit the what, the configure file? Yeah, I probably could. I don't know, this is just for demo purposes; this is what I figured out in three seconds. All right, so everyone noticed that I had a bug in my change, because we're all awesome developers. But before I realized that, let's say I did a dpkg-source --commit — if I could spell commit. Now it asks me what my patch name is gonna be; let's call it pwned.patch. And it brings up this great little template. This is the DEP-3 template I was talking about. This is what the Debian and Ubuntu developers are gonna use to identify what in the world this patch is and why it exists in the archives. So you're gonna want to have a good description here, not just "pwned," and you're also gonna want to include things like the Launchpad bug number. So you see Bug-Ubuntu — if I highlight this, is that good? — you notice it identified the bug simply from the LP: number I used in the Debian changelog. But also, because you pulled this fix from upstream — because we're all responsible developers and we pushed our fixes upstream first — you're gonna want to fill out this Origin tag and give the URL of the upstream origin patch.
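A filled-out DEP-3 header along the lines described above looks roughly like this — the field names come from the DEP-3 guidelines, but every value here (name, URLs, bug number, date) is an invented placeholder:

```
Description: Comment out PS1 evaluation (demo patch)
 Longer explanation of what the patch changes and why it is needed.
Author: Jane Developer <jane@example.com>
Origin: upstream, https://example.com/upstream/commit/abc123
Bug-Ubuntu: https://bugs.launchpad.net/bugs/1234567
Forwarded: yes
Last-Update: 2019-03-09
```

The Origin and Bug-Ubuntu lines are the ones the sponsoring developer leans on most when verifying where the patch came from.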
This way, when you submit your debdiff to a core dev, they can actually go and verify that what you're submitting is the same thing that came from upstream, possibly massaged to work with the sources you're looking at. It gives them a good place to start — you've given them as much information as you can to approve your submission. And fill out any of the other fields that might be appropriate; there's actually DEP-3 information on the Debian packaging wiki page I linked earlier in the slides. All right, so I don't even know if this is gonna work — well, okay, great. So we covered that. Now, back to my problem in my C code, right? I've created the patch — do I want to undo everything and then redo the patch? No. Let's go back to y.tab.c, go back to PS1, and I notice I missed this semicolon. So I add the semicolon to y.tab.c, but I've already got the patch in the series file. This is why you want to know quilt: what I can do now is just a quilt refresh, and it'll refresh my patch with the new version of my changes, okay? So that's how my bug-fixing workflow goes: I use dpkg-source --commit when I think it's ready, and then I'll use quilt refresh after that. All right, any questions? Did I go too fast? All right. The next thing you need to do is build the source package with debuild -d -S. The -d says to ignore build dependencies, and the reason you want this is because you do not want to build packages on your machine directly. You always want to build in a container of some sort, because otherwise you're just gonna muddy your machine with a ton of packages and never be able to get rid of them. The second thing debuild does is sign the package. You noticed that I used my DEBEMAIL signature in the changelog.
It's gonna use that DEBEMAIL environment variable to identify which GPG key to use to sign my .dsc file. Okay, so that's what it's doing at the signing step. The next thing, once you've created your source package, is to build it using sbuild. sbuild is the best way to build packages for Ubuntu — it's basically a clean chroot build — and there's a great wiki page, so I'm not gonna go through each command on how to set up sbuild. Follow that page; every Debian developer, every Ubuntu developer, follows it every time they set up a new laptop. Then, once you've built your package and you're sure it builds and you're sure it works, you're gonna grab a debdiff of the original package against your new package. That is just the difference between the two, right? And that is what you're gonna upload to Launchpad for a Debian developer to sponsor — I keep saying Debian, I mean Ubuntu developer. All right, so back to the demo: debuild -d -S. It looks weird, it's just big. All right, so we're building our source package. This is the source package that will eventually get pushed through the Launchpad builders to create the binary packages. You'll notice at the bottom here that the changes file is signed by dave.chiluk@ubuntu.com. All right, next is sbuild. Let's see what that created. We previously had just the ubuntu1 sources; now we have the ubuntu2 sources, right? And what we want to do is build this .dsc file. What sbuild is gonna do is spawn a new schroot from the Bionic chroot it keeps, update it to the latest version of all dependencies, including those in proposed, and then run the build.
And we're not gonna sit here and watch this, because it takes, I think, like 20 minutes. But everyone saw apt go by — it's going by right now. All right, so the next thing is to test the package. How would I test this? The easiest way is to install it on Darryl's machine, right? Unfortunately, since Darryl's not here, I'm going to just install it on my local machine to show that it actually worked. I've already got it built, because I've done this before, and I'm just going to dpkg -i my new bash package. And it's now installed. It is now installed. So if I open a new terminal — I've now been pwned, right? Fantastic. We've patched sources in Ubuntu, and that was the hard part. Now the harder part is actually getting it accepted and pushed into the archives. So how does that work? We've tested our package; now we're going to create a debdiff and attach it to Launchpad. So what does that look like? debdiff. Can everyone read some of this, or is it just way down at the bottom? All right. Everyone's seen a diff before, right? That's all this is. It shows us the changelog change, and the only interesting thing is that the patch we created is added as its own new file — pwned.patch in here is its own new file; you see all the pluses along the edge here. Great. And that's what you're going to upload to Launchpad. What you're going to need to do, though — you've now just solved it for Bionic. You're going to need to fix it for Cosmic and Disco too, because you can't have Bionic fixed and then have someone upgrade to Cosmic or Disco and go, "hey look, it was working in Bionic and now it's broken again." So you're going to have to create three different debdiffs.
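At its core a debdiff is just a unified diff between two source packages. This sketch uses plain `diff -u` on two tiny made-up files, not the real debdiff tool, just to show the format:

```shell
# Two toy files standing in for the old and new source trees; a new patch
# added by your upload shows up as all-plus lines, as described above.
printf 'old line\n' > before.txt
printf 'old line\nnew patch line\n' > after.txt
diff -u before.txt after.txt || true   # diff exits 1 when files differ
```

The real `debdiff old.dsc new.dsc` (from devscripts) produces the same kind of output, but computed across the whole unpacked source package.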
Typically this is pretty easy, because the debdiffs for the three different versions might be really close and the same patch might apply to all three. But if it doesn't, you're going to have to port those patches around to make sure they apply cleanly. All right, the next thing, once you've uploaded your debdiffs to Launchpad, is to fill out what is called an SRU template. SRU stands for stable release update, and there's a wiki page you'll want to look at to grab the template for stable release updates; you copy and paste that template into the top of your Launchpad bug description. Okay? Everyone follow that? I think I've been talking for a while. What this is useful for is the core dev who's going to try to sponsor your patch. They're going to look at this SRU template, and it asks you things like: how do you recreate the original bug? What I will do as a core dev, when I'm sponsoring a patch, is go and actually try to reproduce it — preferably in an LXD container. I'll try to reproduce your exact problem given the steps you provided, and if I can't, I'm going to be like, whoa, something's weird. It also asks how you've tested it, how you pushed it upstream, and what the regression potential is. We don't like breaking people in Ubuntu. We would rather something never worked at all than have it work and then break after an update. Unfortunately, that happens, and the reason it happens is because we aren't creative enough when reviewing these things and thinking about what kinds of regressions are possible. So you want to do your best to identify what regressions might be in play. I recently uploaded a fix for a USB headset, and I said the regression potential is minimal because the headset currently does not work at all, right?
And the fix only affects the USB IDs of the headset I'm applying it to, so it's kind of hard to have a regression there. Next, now that you've done all of the administrivia, you're going to have to wait on a core dev to actually do their side. The way to do this is to subscribe ubuntu-sponsors to the bug. And if you don't get anyone to respond in a timely manner, you're going to want to go to #ubuntu-devel and ask for a core dev — be like, "hey, can anyone sponsor this patch? I'd really like to get it uploaded." If you ping me — I'm chiluk on IRC — I will do my best to help out the community, as I am a community Ubuntu member and a community Ubuntu core dev, right? A lot of the core devs work for Canonical; I'd like to help the community get their fixes in. Now, once it's uploaded, it doesn't actually get immediately built. Because we want to keep Ubuntu stable, we want to make it very difficult for changes to go in that affect everyone on the planet, right? So we have at least two people with archive permissions reviewing it first, and the second person that needs to review it is an SRU team member. In order to get their review, you're going to subscribe ubuntu-sru to the bug. The reason you need to is process and safety: we want to break as few people as possible, and we don't have a better way. All right, once the Ubuntu SRU team member approves your SRU, it gets built by Launchpad and pushed into the proposed archive. This is when you go into your Ubuntu package manager and enable proposed, because you want to help out the community — what you're doing there is running the packages that were most recently changed, right?
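The SRU template sections the sponsor reads look roughly like this — the section headings are from the Ubuntu SRU process, but all the content below them is invented placeholder text (the wiki page mentioned above has the canonical template):

```
[Impact]
 * LP: #1234567 — the prompt handling breaks for users of X; affects
   everyone running package Y on Bionic.

[Test Case]
 * Steps a sponsor can follow in a fresh container to reproduce the bug,
   then verify the fix using the package from -proposed.

[Regression Potential]
 * Low: the change only touches the USB IDs of hardware that currently
   does not work at all.
```

A sponsor will literally walk the Test Case section, so the more mechanical those steps are, the faster your SRU moves.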
Once it hits the proposed archive, your job is to install it from proposed, because who knows — your package might get built slightly differently on Launchpad than it does on your local machine. If you're using sbuild, this is incredibly rare. If you're building with debuild or some other mechanism, it's more likely. I've actually seen instances with a previous tool called pbuilder where a package built fine and worked fine under pbuilder, but when I built it with sbuild it failed to build, or had completely different functionality, because the dependencies were pulled in slightly differently. All right, so again: it hits proposed, you've got to test it again. Once you've tested it in proposed, there's going to be a tag on Launchpad that says verification-needed. You need to move that tag from verification-needed to verification-done for every series your patch applies to. So if it was applied to Bionic, Cosmic, and Disco, you're going to need verification-done-bionic, verification-done-cosmic, and verification-done-disco. If you don't do that, your package will automatically be removed from the archive after, I think, a three- or six-week timeout. All right. And now, because we are awesome members of the Debian community, the next thing we do is submit it back to Debian. There's another tool for this called submittodebian, which is part of ubuntu-dev-tools, and it will take your debdiff, create a bug on Debian's bug tracker, and upload the debdiff there. The other reason for submitting it back to Debian is that we do not want to carry a delta against Debian. What we do as Ubuntu is stabilize the unstable or testing packages we get from Debian, on a decent cadence, so that the result is predictable and useful for normal people.
Debian is great, we love Debian, and we want them to succeed as much as Ubuntu succeeds; by submitting our patches back to Debian, we help them out as much as they help us. Or we try to — they do an awful lot of work. All right, so who cares about the kernel? Oh, I've got two minutes. Do we have anyone who cares about the kernel? All right. When I was working for Canonical, I was part of the sustaining engineering team, and I was also attached to the kernel team doing some kernel work, so I spent a lot of time talking with those guys and I know them all pretty well. So let's just jump into it. How do people get the sources for their kernel — for Ubuntu? Anyone not reading my slide and want to tell me? No, no — good try, but no. That's exactly what a lot of people tell me: "oh, I just go to kernel.org for the sources." Well, the problem is Ubuntu has a delta on top of the kernel, just as it has a delta on top of Debian. The reason for this is that we want to make Ubuntu work for as many people as possible, and there are lots of reasons why we might have a delta from kernel.org. One reason might be hardware enablement: my USB headset is a minor example, but say there's a new version of a processor that we need to make sure works in our long-term support releases, right? There might be patches for those processors that didn't go into Linux stable, but actually did get pulled into the Bionic tree. The other thing — I'm drawing a blank right now — what is our security mechanism in Ubuntu? Anyone know? It's not SELinux; what's the other one? AppArmor, good. So the other reason not to pull from kernel.org is that kernel.org isn't gonna have the patches for AppArmor, whereas the sources you pull from Ubuntu do. Okay?
The other nice thing to notice is that we give you a Git tree. Who in CentOS land is getting a Git tree for their kernel sources? Absolutely nobody. If you're running RHEL or CentOS, you're getting an RPM source package and a tarball. We give you a Git tree with every single one of the patches broken out in it. It's not just a fork plus a pile of patch files, like we have for all the rest of the packages; it is the full history of that Git tree. It's really great. The other nice thing is that when you clone it, you can reference an upstream Linux tree. In the git clone on this slide, I reference linux right here, and that decreases the amount of disk space the clone takes up on your machine. Once you've pulled the sources, there are two branches you need to care about: master, which is what's sitting in -updates, and master-next, which is what's sitting in -proposed. So depending on what you're looking at, that's the difference between those two branches. The other thing you might want to know is that there are also going to be HWE branches for the LTS releases. So let's say you have a problem with an Ubuntu kernel. First, you're going to go through a very similar process as with any package: you want to make sure it's fixed in the upstream Linux kernel first, like 5.1, the leading-edge RC that doesn't quite exist yet. The easiest way to do that is to use the Ubuntu mainline kernel builds. If anyone is running these mainline kernel builds on their Ubuntu machine day to day, you probably should not. They do not include all of the Ubuntu patches; they are very bleeding edge, very much development builds. I would not recommend them for regular use.
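The --reference trick mentioned here can be demonstrated entirely with local throwaway repositories. In this sketch the repo names are made up: "upstream"/"linux" stand in for the Linux tree on kernel.org and "bionic" for the Ubuntu kernel tree. The second clone records the first in .git/objects/info/alternates and borrows its objects instead of copying them, which is where the disk savings come from.

```shell
# Build two tiny local repos to show what --reference actually does.
set -e
d=$(mktemp -d) && cd "$d"
git init -q upstream
git -C upstream -c user.email=a@b -c user.name=demo \
    commit -q --allow-empty -m base
git clone -q upstream linux                        # local "upstream Linux" clone
git clone -q --reference ./linux upstream bionic   # borrows objects from ./linux
cat bionic/.git/objects/info/alternates            # proof the objects are shared
```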
But in order to test whether your issue is fixed in an updated kernel, this is the fastest way to get a kernel at the very latest version and find out. Once you've discovered that it's fixed in those upstream kernels, you can bisect it and bring the patch you discover from that bisection back into the actual Ubuntu kernel. All right, so you bisect it, find the correct commit, and then do a git cherry-pick -s -e -x on the upstream commit: -s adds your Signed-off-by, -e lets you edit the message, and -x records a "cherry picked from commit" line pointing at the upstream commit. That applies it to the Ubuntu tree, and then you can use git send-email to push that patch up to the Ubuntu kernel mailing list. All right, if it's not already been fixed in the mainline kernel, the first thing you want to do is fix it upstream in mainline, and then possibly submit it back to Linux stable, because you're a good community member. Getting it into Linux stable will let it be included by default by the Ubuntu kernel team. The Ubuntu kernel team does its best to follow the Linux stable rules, but again, we have the directive of making Linux easy for everybody, so there are times when we will accept a feature patch in order to fix something that we know is affecting a lot of users. And I say "we" loosely, because I'm actually not on the team anymore, or in the company, so don't yell at me. But in order to submit those patches back, you're going to follow Documentation/stable-kernel-rules; actually, I think it's Documentation/process/stable-kernel-rules now. The documentation has moved since I made this slide; I thought I'd updated that, but oh well. Once it gets pushed into Linux stable, the Ubuntu teams will eventually include it. However, a much easier way to resolve problems with hardware support is to run the HWE kernel.
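The cherry-pick flags can be seen end to end in a tiny local repo. This sketch drops the -e flag, since that opens an editor and the demo is non-interactive; everything else (repo, branch names, identity) is invented for illustration.

```shell
# Demonstrate what -s and -x add to a cherry-picked commit message.
set -e
g() { git -c user.email=a@b -c user.name=demo "$@"; }
d=$(mktemp -d) && cd "$d" && git init -q .
g commit -q --allow-empty -m base
main=$(git symbolic-ref --short HEAD)   # master or main, depending on git
git checkout -qb upstream               # stand-in for the upstream tree
g commit -q --allow-empty -m 'fix: the bug'
fix=$(git rev-parse HEAD)
git checkout -q "$main"                 # stand-in for the Ubuntu tree
g cherry-pick -s -x --allow-empty "$fix"
git log -1 --format=%B   # message now carries provenance and a sign-off
```

The resulting message keeps the original subject, plus a "(cherry picked from commit …)" line from -x and a "Signed-off-by:" trailer from -s, which is exactly the provenance the kernel workflow wants.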
Now, HWE stands for hardware enablement kernels, and if you are buying new servers today and trying to install Ubuntu on them, your best bet is to run the HWE kernels in order to get them running as well as they can. You're running Bionic and trying to install on brand-new servers from AMD? Again, HWE kernels, right? Those are basically the newer series' kernels brought back onto the LTS release. So for example, on Bionic, that would be the Cosmic kernel. All right, so you've submitted upstream. Oh, there's the correct link: this is where it actually lives in the kernel sources. If you download the kernel, you'll see Documentation/process/submitting-patches, which describes how to submit patches back to the kernel community. I have a whole other talk about submitting patches back to the Linux kernel. Submitting to stable we've already covered. And now, building your kernel. Typically, when you go to build a kernel for quick iteration on your own patches, you just do a make oldconfig and a make. That's not recommended for any kind of distribution to servers, because you've just got a vmlinux and a folder of modules: it's not really easily installable, not really easily anything. If you're building off of the Ubuntu kernel trees, though, and you alias fdr to fakeroot debian/rules, quick note there, then fdr prepare-generic creates the config for the kernel I'm trying to build, and fdr binary-generic actually creates Debian packages that you can then distribute elsewhere. All right. And just a quick note: if you're doing development and you just need to build a subset of your tree, and you're using the Ubuntu tree, this is how you would point it at the build directory so that you can build just that small subset. Other patches.
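The build shortcuts above look roughly like this. This is a sketch to be run inside an Ubuntu kernel source tree; the target names follow the kernel team's wiki conventions, so double-check them against the wiki for your series before relying on them.

```shell
# Inside an Ubuntu kernel git tree:
alias fdr='fakeroot debian/rules'
fdr clean               # regenerate control files for this tree
fdr binary-generic      # build installable .deb packages for the "generic"
                        # flavour, ready to dpkg -i on other machines

# For contrast, the quick-iteration upstream-style build the talk mentions
# (leaves you with a bare vmlinux plus modules, not distributable):
#   make oldconfig && make -j"$(nproc)"
```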
So we have this idea of SAUCE patches in Ubuntu, and those are the patches I was talking about that are not Linux stable appropriate, and may not even be mainline appropriate, but that we consider valuable and useful to the users of Ubuntu. There aren't very many of them, but they do exist, and they make our lives and users' lives better, and that's why we keep them. Sometimes they're not accepted upstream for political reasons, and I'm not going to get into that in this talk. All right, so once you've got your patch, this page is actually great for describing how to deal with the kernel community, the Ubuntu kernel community, and they're usually really responsive on that mailing list because it doesn't get a ton of mail. So the way you push your kernel fix back into Ubuntu is actually by sending it to the mailing list, which is very similar to the upstream Linux kernel process. All right. Any questions? Yeah, yeah. No, I don't have the handheld. Here, let's get the handheld mic to you. Is that it right there? All right, check, check. I'm glad you actually got something out of that talk; it makes me really happy. Yeah, cool. Question: Ubuntu supports a variety of non-x86 hardware; where does that get tested? So, you're going to have to be more specific on that question. Say I have a patch, and it builds fine on x86, but when it's compiling it breaks for PowerPC or ARM or something like that. I'm just trying to understand what the workflow is for Ubuntu on non-x86 hardware. So are you fixing a build problem for the alternate arch, or...? I'm just trying to understand: let's say I'm working on x86, and that fixes the build for me, but it breaks for PowerPC. How does Ubuntu do builds on all these other architectures every night?
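Sending a kernel fix to the Ubuntu kernel mailing list follows the usual upstream git workflow. The list address below is the real one, but the flags are just the common pattern, so treat this as a sketch and check the team's wiki page for their exact cover-letter and subject conventions.

```shell
# Turn the top commit into a mailable patch file, then send it to the list.
git format-patch -1 -o outgoing/
git send-email --to=kernel-team@lists.ubuntu.com outgoing/*.patch
```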
Okay, and are all those self-hosted or cross-compiled? You're not sure? No, it's not cross-compiled. Ah, okay, good. So the bottom line is that it's not really the developer's or the patch writer's responsibility to test other architectures. If you end up breaking another architecture, you're probably doing something really unusual. But it will get built by Launchpad, and you'll see the build failure. It actually happens to all of us; I've broken a build for an alternate arch before. We do have a page called FTBFS: if you go to wiki.ubuntu.com and look for FTBFS, it stands for failed to build from source, and it'll show you all of the packages that failed to build from source for alternate arches, for any architecture. Typically that's used while we're doing plus-one maintenance, which is the process we use for bringing Debian back into Ubuntu, and that is typically when we find failed-to-build-from-source problems. But if you wanted to work on alternate arches, that's where I would start. Thanks. Yeah. Anyone else? And thanks for coming. That was an excellent talk. If I followed along correctly, there's a version of proposed for every version of Ubuntu? Oh yeah. And where do I get proposed from? You add the proposed archive to your apt sources list, and the easiest way for an end user to do that is to go into the Software & Updates tool and tick the proposed box. I can actually show you. Any other questions? Thank you. Updates? Oh, fantastic, man. I'm giving another one on kernel debugging basics, Saturday at 1:30. So this is it: one of these tabs has the proposed, pre-released updates archive: restricted, main, universe, multiverse. Yeah, there you go, bionic-proposed. Okay? That's the easiest way to turn on proposed. You can also just modify the apt sources list directly.
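Once proposed is enabled as described, one way to confirm which pocket a given package actually came from is apt's policy output. The package name here is a placeholder, not something from the talk.

```shell
# Shows the installed version and which pocket (bionic, bionic-updates,
# bionic-proposed) provides each candidate; "mypackage" is hypothetical.
apt-cache policy mypackage
```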
One of the tools you use actually builds the software in its own container, so you don't have to worry about it impacting the rest of the system. That wiki page is invaluable for doing that, and doing it without screwing up your machine. Yeah, that is the best way by far. Thank you, Dave. Thank you for asking, Greg. Some of this documentation has no real good structure to it; I'd be happy just to be able to use my house's documentation or my car's documentation. Yeah, I was going to make a suggestion, and I don't know if it's viable: maybe the team, or the group of individuals doing the documentation, could have a shared core, and the specialty versions for the different flavors could parse it out and say, okay, this is this flavor's module, this is that module, with the shared core underneath. Doing it that way would save you from having to spend, you know, a thousand years trying to keep up. So I just wanted to offer that as a suggestion. I'm trying to document this from a user's perspective. Do you think that's worthwhile? It really is important; it takes on the user's point of view. So if you have real and meaningful skills in technical or business writing, and you've got some time on your hands and interest, find a project that you like and contact the team leader or the documentation team and say, how can I help? You will be surprised how open those arms are. Documentation is sorely under-serviced. Lyn, how do you go about getting the ideas for what to put into a piece of documentation?
Do you run the program? So you have no access to the source code as your source of information? There is, for a binary... I don't know how to explain this to a non-technical reader, so I'll just leave that out. You could work it out by looking at the source code, but I'm not sure how to turn that into the way of thinking of someone who's non-technical; it would make everything a lot harder to read, doing it that way. Any other questions? Anyone else wanting to help with docs? How about a round of appreciation for Lyn? All right then, we've got a break, and... sorry, I got stuck there. What's that? You did go a little quick, but that's all right. A bit of a break gives us more time to chat amongst ourselves, and that's okay. And then you have extra time to think of the questions that you need to know the answers to. Maybe they're related to Ubuntu, which is kind of what we're here for. We do an Ubuntu Q&A, and that's what we're doing at 4:30. But if you have any question, feel free to bring it and we'll see if we can answer it. And the standing rule, as we've had every year in this tradition, remains: if we don't have an answer, we will research it and find the answer by tomorrow morning when we come back. That's what we're going to do. And Dave, who was here earlier, is going to help as well. So it's Nathan and I and Dave. We've got smart people, and me, and we'll be able to answer questions. Lyn, thank you for helping with Lubuntu and for showing people what it takes to put together great documentation. I appreciate that. Is this manual downloadable at lubuntu.org? People can download and see the manual there. Nicely done. I appreciate the commitment to keeping it up in such an ongoing way. I'm going to start basically over, with everything changing, so it won't be super... I find ways you can improve it rather quickly, once you go, oh, everything changed.
There are going to be way more changes than you can keep up with. Well, you're going to be busy. But here's a historical change I didn't know about until your talk, because I'm running an older version: between 18.04 and 18.10, they changed the desktop. Is that right? Yes. And it's now... it was LXDE, and now it's LXQt. Oh, sorry, yes. Why the change, and what are the benefits? I've been contributing since 2013, and it was around then, so it was still built on what was there from before I started contributing. And now I'm like the second-longest contributor on the team. So if you've got a bug in mind, and that's in GTK 2, who would know how to fix that now? How many people do you find who have been around since those days anymore? Did that change affect the system requirements much? Now we know. I wish I had something to demo, but this is actually the oldest version I currently use, and it's from 2009, which I find I still have. But I think the thing is, 18.04 is an LTS, which is why we changed right after an LTS; it also meant getting everything way more polished by the next LTS. Any other questions? [A stretch of the recording here is inaudible.] Hello everyone. Dave is going to help today, and Nathan put him on the slide, so it's official. This is our Ubuntu Q&A thing. We do this every year, as always. Ask any question you want, ideally about Ubuntu, but it can be anything, anything at all. Preferably about Ubuntu; we're more likely to be able to answer it soon if it's about Ubuntu. If we don't have an answer, I will research it, and we will have answers tomorrow morning when we come back. That's the promise. So who wants to be first with a question? We have a question already: where's the group shot from last year's UbuCon? They're like LTSes.
Every two years. That picture will last for a long time; we're going to support that picture. So yes, we'll get it out. It's in beta testing, and then we'll get it out within two years, January 2020 or thereabouts. Maybe 2022. Maybe 2024. That's a very good question. We started to build one, and then we had this really great ubucon.org site, and we had sections within it for the various UbuCons that happen around the world, because there's the one in Florida, there's Europe, there's Latin America. And the Canonical community team was very excited by this, so they fleshed it out and put it on a really good CMS. I wound up with the domain years ago, through an acquaintance who had originally registered it, so we pointed it there and got it all set up. And then, with the Canonical reorg... what is the current status of the server? Show us what that looks like. Okay. So I'd like to thank everyone for being here. My name is Nathan Haines, and I am the leader of the Ubuntu California local community team. I'm just crazy about the project, and I'm in charge of running the Ubuntu booth at SCALE and California events and all of those. I'm also on the LoCo Council, which works globally, worldwide, and we support local teams and events. I've been at SCALE for 12 years this year. Hi, I'm Dave Chiluk. I work for Indeed.com, and I am an Ubuntu core dev, which is why I'm standing here right now. That means I am someone who can actually sponsor fixes and push changes into the archives for Ubuntu. So I am more of a technical person, but I'm still part of the community. I am not employed by Canonical; I used to work for Canonical for five years, and that's when I actually became a core dev. But the cool thing about Ubuntu is that when you become a core dev as a Canonical employee, you keep those permissions after you leave. So, hi, that's me. My job is a lot less interesting. Oh, that's not my job.
No, well, history, background: I design software, been doing it for a while. My name is Richard Gaskin, and I'm an Ubuntu fanboy. I spent many years doing work exclusively for Mac, then branching out to Mac and Windows, and then clients started asking me about this thing that just seemed ridiculous: it's free, they give it away, they were calling it Linux. They said, can you make software for Linux? I said, why? What? Okay, I'll try it out. Well, I got kind of hooked, so you guys have been seeing me ever since. So that's why I'm here. Hi. I don't know if you guys know this; maybe it's too specialized. We use Dell servers at our place, and in the past Dell only supported Red Hat and SUSE. But it looks like, on the new generation of servers, they're going to start supporting Ubuntu. Are you guys aware of that, or do you keep track of that kind of stuff? It's kind of a big change for Dell, I think. Initially, they provided Ubuntu support for a lot of their laptops; they started shipping some models with it preinstalled. I haven't talked to them in about a year, and I know back a year ago they were talking about trying to bring that support package to their servers, but that's really all I know. I know they're constantly working on it, because they know that it's really huge. I will say that there is a Dell project on Launchpad that helps support all of their laptops, and I would assume that the same people supporting the laptops are also supporting the servers, if they are actually signing up for that support. And we know with Project Sputnik, which Barton George runs, that they're developing laptops that ship with Ubuntu support. I'm curious, what type of issues are you having with these servers? Hopefully they'll do the same for the servers. Yeah, so I actually have a Dell laptop that I bought with Ubuntu installed on it.
And I just did the firmware update yesterday or something like that, and it worked pretty well. I mean, there was one little glitch, but it worked pretty well. But anyway, yeah, the issue with the servers is two things. First, the firmware updates: in the past, they distributed Red Hat packages or Windows EXE files and the like. And the other thing is, Dell produces this OpenManage Server Administrator thing for managing the server: the health of the server, disks, fan speeds, temperatures, all that junk. And they very nicely supported it for Windows, Red Hat, and SUSE. But for Ubuntu, they had a community-supported version of it, but it was always lagging behind and had trouble and stuff like that. So, a pain in the neck, but I thought I read somewhere that for the new generation of Dell servers, they were going to support Ubuntu in that Server Administrator type thing as well. I was kind of just curious; the hearsay kind of thing is good enough for me. I mean, yeah. But here's a bonus answer: I'm going to see if we can get Barton George to make a comeback at UbuCon next year. He's the Project Sputnik leader at Dell, and he tends to know a lot about these types of things, so we'll get an even better answer if we can bring him back again next year. Yeah, we'll try it. I just had a follow-up question: did you have to add a PPA for your firmware update over there? Because I've got a Dell laptop myself. No? Okay, okay. What was that? Oh, I didn't either. He nodded knowingly. I was thinking about something else for a moment, so I'm sorry to be mentally absent, but there we are; I'll be honest about it. Let's repeat this whole moment, and this time we'll all get it right. All right. Okay, the question was basically: I wanted to know whether I had to add a specific PPA or not for the firmware updates on my Dell laptop.
What I noticed when I tried it is that I had to use an SPI programmer and reflash my BIOS chip off of something from the internet just to recover my laptop afterward. Dell's doing great; Lenovo, I'm looking at you. When it comes to how things work right out of the box, it's really, really hard. I have the Dell XPS 15 laptop, but mine is the Windows version, not the Ubuntu version. Are there really such differences, besides maybe the Wi-Fi chipset, that I can't just erase Windows, put Ubuntu on it out of the box, and have it work? Or are there actually firmware or BIOS tweaks or things that were done to it? Oh, I couldn't care less about the camera, but the touchpad is a Synaptics driver, so that should be... Okay, thank you. One thing to note: if you're going to be running a live CD on a machine where you are unsure whether or not it will work, it's probably best to run the latest non-development series, so today that would be Cosmic, 18.10. The reason for that is it's going to have the latest kernel in it and the latest software that's available, which will hopefully have fixes for your hardware, especially considering laptops get refreshed every... we go through them faster than you go through servers, typically. That would be my piece of advice. If you want to go with Bionic, go ahead and try it, but if it doesn't work, the next thing I would do is check Cosmic, or even Disco, because newer is always better with software. Oh, then you're perfect; yeah, Bionic would be great. My rule of thumb with installing software is to do your best to install software that came out six to eight months after your machine was created, right? Oh yeah, you're golden. Okay, I have five laptops, different ones, HP, Dell. I've been installing Ubuntu for many years.
Lately, I've noticed that the latest one, 18, is giving me a lot of errors, you know? And they say, send the errors to the developers, okay. With 16, I never had any problems. So I don't know; I cannot figure out what kind of problem it is. I'm working on the new 18, so I'm going to try to figure it out, but 16, like I said, doesn't give me any problems. So, my piece of advice for that: I have a feeling it's probably one application. Instead of just sending the report, I would view the report; there are two options. When apport asks, would you like to send this error, I'd view that error, and then I'd actually go and search for that error. Apport deduplicates the reports, so that we don't end up with a flood of duplicate bugs on the Launchpad side. But what you're going to have to understand is that this isn't paid support, right? You're basically asking a community member like me to fix your bug, and it's a volunteer thing on Launchpad, so you kind of have to put in a little bit. I wish you'd come to my talk about how to fix a bug and push the fix back in. But basically, my advice is to trace it back to Launchpad and get involved with the bug there as much as you can, especially if you can actively reproduce it. Yeah, and as another start, if you have the laptop here and you can reproduce it, you can come by the Ubuntu booth. We're not going to fix the problem, probably; if it's a bug in the software, we can't fix it there. But we can maybe give some advice to minimize the impact, and we can help you track down exactly what's causing the problem, so we can point you to where to either get better advice, or to let the developers know so they have a chance to address it.
And that's very, very valuable to do, because if you start a program and it just immediately crashes, well, if the developers had known about that, they wouldn't have released the software. It could be a hardware configuration or software thing; I mean, there are a billion things that could happen. So the more information developers have, the better they can make the program. It is hard work, but you're still helping improve the software. Okay, this is pertaining to the software itself. I'm getting a call. I'm in the Q&A! I'm right here in the front! I've had hecklers, but that was different. The least imaginative. I'm a millennial. I love millennials; we're not all bad. On x86, are you suspending to disk? Because if you're just going straight to suspend, I have a feeling it's probably not because of the SD card: the way suspend usually works is that the laptop keeps the memory powered and shuts down everything else. So unless you're using suspend-to-disk, the SD card is probably unrelated. Do you have swap turned on? It's possible that the SD card is on a USB device that's... I honestly can't tell you off the top of my head, but my gut says that if everything is in RAM, it's probably not the SD card that's the problem. I bet if you booted off of an SSD directly, you'd probably have the same problem. It's probably more like an ACPI problem with the BIOS. Maybe try updating the BIOS first, or look into kernel command line options that will put the ACPI BIOS into, or out of, a Windows mode. It's really, really machine-specific, and it's kind of a crapshoot. That's going to be one of those where you're going to spend a long time Googling. This guy in the back. Here, I've got you. No? Anyone else? I'll ask another one. I'm not sure, well, let's see, how to say this.
With Ubuntu 18, we ended up having some problems almost right out of the box with the /etc/resolv.conf type of thing. I think we Googled and found solutions that seemed sort of like kludges, but if I ever had time, I would do it more carefully and write up what I did. Where would I post that information? What's the best place to post it, to describe what I had problems with? I think I had trouble with both server and desktop setups. Sometimes I wondered if it was our system. We even had trouble with DHCP, having a DHCP setup and getting strange results in /etc/resolv.conf. We just had so many different setups, it was hard for me to figure out which was causing problems and which one wasn't. This is not actually an Ubuntu problem; this is fallout from the move to systemd, one of the things that changed when resolv.conf handling moved to systemd-resolved. Basically, everyone's going through this right now. So, unless someone has better information than that, it's probably not a bug; it's probably your expectation, and you're probably looking in the wrong spot, something like that. I would probably start on Ask Ubuntu, because you're going to have a lot of people giving you suggestions. If you can't get a solution on Ask Ubuntu or the Ubuntu Forums, then I'd go ahead and file a bug with the details: I did this, x, y, z. Because if you're looking at /etc/resolv.conf, I already know that you're probably doing it the old way; that's not where you... Yeah, I think some of the Googling I did pointed to NetworkManager as being suspect somehow, but then I lost the thread and I'm going, what do I do now? If you were running a machine from before 2015, 2016... yeah, the combination with the current system is what you've got to untangle. Sometimes it's going to be difficult to throw away all the old stuff and say, okay. Maybe later; I know you're going to be around.
I'll have to refresh my memory on it, because right now I'm just going off the top of my head. What else is possible? Or Windows, for that matter: are you still having, like, service pack levels? Ubuntu doesn't strictly have service packs like that. So when we release 18.04.2, what that actually is, is the original 18.04 install media, plus all fixes, plus the kernel from the latest non-LTS series; in this case, that would be Cosmic's. And when you install from the 18.04.2 install media, you get the option of whether or not to install with the HWE kernel or with the original 18.04 kernel. So if you are running an 18.04 box and are thinking about upgrading to 18.04.2, you probably don't need to, because you are probably already running 18.04.2. The only reason they release new install media is so that when you download 18.04, you don't have to immediately download three gigs of updates. So, to clarify: if you installed any version and have been running regular daily or weekly updates, you will have the latest version. That's all the new media is; it's just all of that wrapped up. So if you're doing your daily updates, there's nothing to worry about, and you're already on it. Next question. Carlos, you got a question for... All right. Where are we here? You got a question? A question about 16. I'm referring to 16 because I installed all of them. 16.04? 16.04, 16.10... I installed all those. Okay. That's why I'm referring to 16: because I installed all of them. 16.04. The point release is important; a lot of people don't know that, but if you don't post the number-dot-number, no one can help you. Yeah, no. But for my question, it doesn't matter which version I have, because I tried all of them on 16, and now I'm trying 18.04, which is the only one we have now. So... in the 16 versions, I noticed that the wireless automatically finds everything for you. But in 18, that's not happening.
There's always a window that comes up and says, in order to get to a network, you have to click on a window, you know, an extra step. So I'm assuming they developed something else for 18. That's it. They don't want to show you 20 items and all those numbers. So what do they do? They have a network icon, and when you click on it, it opens the network list in its own panel, so you can focus on the networks available. They feel that's cleaner; you do not have to agree. But the difference is that, on our side, we try to keep the look familiar to users, and in fact, if you upgraded from version to version instead of doing a fresh install, you get the new interface, but you can choose on the login screen. What we can do is, we have lots of versions that have just been released, so we can definitely show you. We have 19.04 right now... not 19.04 yet. Before a release, there is only the code name; it's not 19.04 yet. So the code name is Disco Dingo, with a really cool mascot. 18.10 is 18.10, and its code name is Cosmic Cuttlefish, which is what you can see when the slide is up. So we have machines with the 18.04 LTS, and we'll have laptops running it; you can take a look. And sure enough, for Ubuntu: I'm running Ubuntu 18.04 on a Samsung with an AMD Ryzen 5, and it claims it's supposed to go all the way up to its rated boost clock, but it only goes up to 2 GHz. So I was wondering, is that an Ubuntu situation, or more like AMD proprietary drivers and all that? Are you aware of the HWE kernel? So if you look on wiki.ubuntu.com and type in HWE, it will give you a set of instructions on how to install the hardware enablement stack. There's a hardware enablement X.Org stack, and there's a hardware enablement kernel.
I would definitely start with the kernel, because you're looking at figures like the speed of your CPU, not at GNOME or the higher-level interface, so I would probably install the hardware enablement kernel first. That's going to be like installing — I think the package is linux-generic-hwe-18.04. What that will do is install the 4.18 kernel, I believe, from cosmic, and as 18.04.3 comes out it'll roll forward with whatever that ships — and that happens a couple of months after the interim release, after the kernel team decides it's okay. The other problem you're going to run into with a Ryzen laptop is that there are a number of issues still upstream — still a problem in the mainline kernel — that AMD is actively working to resolve. I don't know what that list is — I'm not an AMD employee — but I'm aware that the curse of running bleeding-edge hardware on Linux is that sometimes you have to wait a few cycles for things to get mainlined. I have a feeling it'll probably work itself out over time. There's one back here. Yeah, good luck.
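The HWE kernel install suggested above can be sketched as follows. The package name `linux-generic-hwe-18.04` matches the speaker's recollection; verify it against the wiki page mentioned before running:

```shell
# Install the hardware-enablement (HWE) kernel on Ubuntu 18.04.
# This pulls in the newer kernel series rolled forward from the
# most recent interim release.
sudo apt update
sudo apt install --install-recommends linux-generic-hwe-18.04

# Optionally also install the newer graphics (Xorg) HWE stack:
sudo apt install --install-recommends xserver-xorg-hwe-18.04

# Reboot, then confirm which kernel is actually running:
uname -r
```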
Run the HWE kernel — that's definitely what I'd do. The easy way is: you can Google "Ubuntu hardware enablement stack," and that will get you to the wiki page at wiki.ubuntu.com for the kernel and the graphics stack. And actually, 18.04.2 should have the HWE kernel installed by default if you installed from the 18.04.2 media; if you installed from the original media, it doesn't — you won't reboot one day and be surprised by a new kernel. If you come by the booth, we have lots of ISOs; if you have a USB key or need a disc burned, we can do all that, and you can try it out before you go through all the hassle.

My question is about updating. Typically I update from the terminal, and after having updated and upgraded and then updated again, the software updater — the GUI — will come on and say you've got updates, and I don't believe that to be true. So I'll do the same thing in the terminal again and be told there are no updates. Is there a way I can refresh the GUI so that it knows the same thing the terminal does?

Typically what happens is that phasing plays a big part in updates. You never want it to be that there's one update and then you suddenly find out things don't work — or now you have version 23 of LibreOffice and it's crushed all your files. Not everyone gets an update at first; it can take 24 hours to roll out. What happens is that you have a list of all the packages that are available; the system downloads the update metadata and looks at what version you have and what version is available — and there's an extra number. They roll a die, and your computer rolls a die as well, so that if they break something, they have a couple of hours to roll it back before it takes out all your packages — which may or may not have happened once before. So to answer your question directly: when you do apt update, it pulls the latest software information from the server.
Whether you update from the terminal through apt or through the Software Center, sometimes they're using a different mirror — one rather than the main U.S. server — and sometimes mirrors take a little longer to update. Your problem is really that the software updater checks on its own schedule; if it says there are updated packages, clicking it should check and install all of those things.

I run sudo apt update, sudo apt upgrade, then sudo apt update again to make sure that it's as up to date as it can be. And then, with no prompting from me, the software GUI will come on and say you've got updates. So I'll run the terminal again to make sure there aren't any updates, and then the software GUI will find some updates and install them — or claim to.

Give that a try.

I will. And this one's going to be fairly simple: how long has apt — the new command — been around now? I mean, there's apt-get, and then we have apt.

So apt is the streamlined replacement for apt-get. apt-get still works and does all the same things, and its output doesn't change. It's been around two or three years now — apt is where they can change the command a bit more going forward. Let's move on.

Two days ago I was running on a live CD — it had been running as a live CD for a while — and I noticed the system was really bogging down. I thought, I don't think I have that much going on, so I pulled up top, and I was kind of surprised to find a lot of dpkg tasks running. I wasn't doing any kind of upgrade or install or anything. So my question is: is there anything in the live image that would invoke dpkg for reasons it thought it needed to? The only things I was really doing on the system were a browser — I had more than one browser window open — and a text editor; it wasn't Vim, it was gedit. Those were the two primary tasks I had running.

I think we have time for another question.
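To reconcile what the terminal and the GUI each think is available, you can inspect a package's installed versus candidate version directly. A minimal sketch — `libreoffice-core` here is just an illustrative package name:

```shell
# Refresh package lists from the configured mirror.
sudo apt update

# Show the installed vs. candidate version for one package;
# a mismatch means apt itself considers an update pending.
apt-cache policy libreoffice-core

# List everything apt considers upgradable right now, for
# comparison against what the GUI updater is reporting.
apt list --upgradable
```

If the two disagree, it is usually mirror lag or update phasing rather than a broken updater.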
Actually not a question, more of a comment on updates. A couple of times I ran the updates and the printing system would stop — malfunction. Easy fix: I just reinstalled CUPS. But it came from automatic updates; it was something I figured out myself. A general package update — the libraries, the whole deal — just got into a bad state, and I went into apt, reloaded CUPS, and boom, problem fixed.

Updates can break things — though stable-release updates shouldn't — but yeah, I'd ask you to file it on Launchpad rather than just putting a band-aid on it, so it eventually starts getting fixed. All right, I think we are at time.

May I squeeze in one? I have a question for all of you: how many of you currently enjoy, or have ever enjoyed, the Ubuntu desktop? That's a pretty high percentage here. Okay, so bring that enthusiasm to the session tomorrow afternoon during this time slot, because then we're going to brainstorm how the Ubuntu desktop is going to take over the world. The online schedule is not quite updated — the print schedule changed — it happens at 4 p.m., so from 4 to 5 we'll be in here, and we want to have an open talk about what is working with the Ubuntu desktop, maybe what isn't, and where it should go in the future, and we'll start to surface those ideas. Because more now than ever, it's the community that really drives focus in the desktop. And I don't say that like Canonical is not working on it — the desktop team is giant, and Will Cooke and everyone under him work really hard. It is amazingly impressive to see: as they were working on 18.04 LTS, three weeks before it was out, I got to go to New York — they flew me out and I was a fly on the wall; I had other reasons for being there — but they worked really, really hard. So we can start scoping that out, build awareness of things that need to be done, and see what we can do about it. We would love for everyone here — and all your friends at SCALE who are interested — to join us tomorrow at 4 p.m.
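The CUPS fix described in the comment above amounts to a package reinstall. A hedged sketch of the equivalent commands on a systemd-based Ubuntu install:

```shell
# Reinstall CUPS without purging its configuration files,
# then restart the service and check that it came back up.
sudo apt install --reinstall cups
sudo systemctl restart cups
systemctl status cups --no-pager
```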
Beyond that, you guys have a great time tonight, and we'll be back here tomorrow. Our opening talk is on snaps, from Alan Pope at Canonical, and we'll be here at 10 o'clock. I hope to see you then. Thank you all for coming. We're also on the expo floor all weekend, so thank you so much.