Okay, let's get started. Welcome to our next talk. There's no external audio, so please be quiet. You can vote for the lightning talks in front of room B105B. Our next speaker is Jakub Ruzicka from Red Hat. There's also the feedback form, and if you ask questions, we'll give you a nice scarf.

Okay, so hello. I work in the shadows of packaging for RDO, the community distribution of OpenStack, of which I have this beautiful t-shirt. There are many questions about what these letters mean, but the first one is definitely not Red Hat. So it's a community OpenStack distribution. We actually opened up the entire thing about a year ago. It used to run on internal infrastructure, but now the whole packaging process is open; you could contribute right now as you sit here. The first step would be to come to our IRC channel and have a chat with us. Any person can now be part of our distribution. OpenStack is a really big project and we are packaging lots of packages. When I started about three years ago, it was basically one dude manually shuffling packages on the repo server. It was pretty horrible. Since then we have moved things forward: we now have continuous integration and whatnot. And my personal contribution to this process of making packaging better is rdopkg. It is a tool I have been writing for the past three years. Nobody asked me to write it; I was just doing the packaging, and some tasks were repetitive and boring. Upgrading a package to a new version would take like 30 minutes of git commands I had run so many times over and over again. So I simply got pissed. And I also just lost my keyboard focus, sorry. So let's get on with the presentation. First, I'm sorry: my teeth are reconfiguring in a weird way, so I'm barely able to speak, and I wasn't able to prepare as much as I wanted for this presentation.
So I hope it will fit the format and your time will be productively spent. First the theoretical part; then, if you are still interested in not leaving the room, I will show you examples of rdopkg. I wasn't able to properly prepare the practical part, so I will do what I do every day at work and hope it works. If it breaks, at least you will see how I use rdopkg to fix it.

So first the prepared part, some basic information. rdopkg is basically a collection of RPM packaging scripts; it is a packaging automation tool. It can do many things I found boring, things I was not willing to do by hand again. So, like any other packager, I would write a script, and there are actually many packaging scripts out there; practically every packager has their own set. So I tried to centralize this. rdopkg already has some users. I am the main user, of course, because I'm a lazy person, but there are other people too, for example Haïkel; he's the power user. These packaging warriors are slaying problems by the thousands thanks to its immense power. Or so I would hope. It is my main project, although I wasn't explicitly tasked to write it. I thought this tool was absolutely necessary if we were to move anywhere with RDO, because the sheer amount of packages is huge. Instead of investing the manpower into making the packages, we should invest it into improving the tools and moving them forward, so that humans are only needed in situations where human superpowers are required. Most of the time when I was raging at my work, it wasn't over something a machine couldn't do. So I wanted to fix this. rdopkg is written in Python. I wrote it with other people in mind, so if you would like to use it or contribute, I hope it will be a pleasant experience. If I'm wrong, please prove me wrong.
And it is now about 4,200 lines of Python. That would have been a lot of probably not-well-organized shell scripts had I chosen to go that way. But this is modular Python: you can reuse it as a Python module, and there is, I hope, a nice CLI. I even invented a way to describe the command line interface and the module interface in one declarative structure. I also stole the good habits, things I liked, from OpenStack projects and from other Python projects. It was kind of my playground, so I tried to do it really well so that people would want to contribute to it. And there are actually some contributions already, and there are some users, so I guess it's not a complete waste of time.

So, but why? Yeah, I should talk about this. My main motivation behind rdopkg was to avoid duplicating effort, because there are a few people who do an astonishing amount of work, right, Haïkel? If these people are given proper tools, they can do an even greater amount of work, or the same amount without being too pissed about it. When I entered, it was really chaos: I needed to figure out which people use which scripts and things like that, and there was a huge amount of non-trivial knowledge and caveats in the tools. For example, with RPM and librpm, a spec file should be a declarative file, but after some time and lots of lost hours you discover that certain macros work in weird ways, and when you swap their order it doesn't work anymore. There are pitfalls all over the RPM packaging tools, and lots of them are old. So I wanted to provide a wrapper on top of this to make it less painful. RPM is a legacy thing; it works for its purpose, and it wouldn't make sense to rewrite it, it just works. My solution to the problem of old code like this is to provide fresh new code on top that shields you from the crap of the old.
So that's what I hope rdopkg is about. rdopkg has many features. When I was packaging OpenStack, I was packaging all the client libraries. Most OpenStack projects also have a python-<project>client library which contains the Python module and the command line client (that split is obsolete now, but it doesn't matter). So I made quite a lot of similar packages and performed very similar tasks on them, and that is the core feature of rdopkg that you should be interested in even if you are not packaging RDO: these features should be usable for any RPM package, and I hope to actually split them out of rdopkg in the future to make them kind of a standard, because I really think the tools are working at a lower level than necessary; I would like to move the tool chain a step further, a level higher. There are a few conventions you need to stick with, but when you do, rdopkg gives you the power to perform the basic packaging operations quickly. Most notably, there are three scenarios when you are modifying a package, and these are listed here; each one has one rdopkg action which should help you get it done as quickly as possible. The first and simplest: you just want to modify something in the spec file. For that you would normally use rpmdev-bumpspec or something like that, right? So this is no big deal, but rdopkg does it with a little less effort; we'll get to examples later. The interesting part comes when you are introducing patches. I'll dive into this during the examples, but that is the core patches management functionality rdopkg introduces. So: you already have a git repository where your spec file and the other files needed to create the package live, right? It's called dist-git.
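To make the dist-git idea concrete, here is a throwaway sketch in a temporary directory. All the names here are invented for illustration and nothing is rdopkg-specific: a dist-git is simply a git repository holding the spec file plus any exported patch files and other sources.

```shell
# Sketch of a minimal dist-git-like repository (invented names).
set -eu
tmp=$(mktemp -d)
cd "$tmp"
git init -q -b master python-example
cd python-example
git config user.email demo@example.com
git config user.name Demo
# the spec file, referencing one downstream patch file
cat > python-example.spec <<'EOF'
Name:    python-example
Version: 1.0.0
Release: 1%{?dist}
Summary: Example package
License: MIT
Patch0:  0001-Fix-something-downstream.patch
EOF
# the patch lives next to the spec as a plain static file
touch 0001-Fix-something-downstream.patch
git add . && git commit -qm "Initial import"
ls
```

The pain point the talk gets to next is exactly that `0001-...patch` file: it is a static snapshot, disconnected from any git history.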
Fedora, for example: all Fedora packages live in dist-git, a git repository with the spec file and everything else. What I don't like about it is the fact that the patches are stored there as plain files. When a new package version is released, you just try whether your spec file still works with the new version, right? But sometimes it doesn't, because the patches fail to apply and whatnot. We have git already, so why not let git do what it does best, which is rebasing and moving patches around? We'll talk about that during the examples. So there is this core patches management feature. There is also lots of other stuff that is kind of specific to RDO. For example, there is some advanced requirements.txt handling: OpenStack projects chose to store their requirements in requirements.txt files, but for us this is kind of redundant, because RDO is the distribution and we provide the distribution as a whole, so we want to handle our requirements ourselves. But most of the time they're actually right, so we want a way to compare the spec file, which contains our requirements, against upstream, and there is some functionality for that. There are some build system front ends: we were building RDO packages all over, in Copr, in Koji, and now we build them in the CentOS build system, so there are front ends for those. Then we have metadata for the RDO packages; it's just a simple YAML file with all the information. It's kind of shady, but it works, and rdopkg provides a front end, a browser, for this info. For example, for the python-novaclient package you can see where upstream sits, where its patches branches live, where the dist-git lives, which releases are supported and who maintains them. So literally, whenever I found something that would be worth automating and it wasn't done already, I searched for it and found no solution.
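The "let git do the rebasing" idea can be sketched in a throwaway repo. This illustrates the underlying git mechanics, not rdopkg's actual implementation: a `-patches` branch carries the downstream commits on top of the upstream release tag, a new upstream version means a plain `git rebase`, and the `.patch` files for dist-git are regenerated with `git format-patch`. All names are invented.

```shell
set -eu
tmp=$(mktemp -d); cd "$tmp"
git init -q -b master proj && cd proj
git config user.email demo@example.com
git config user.name Demo
# upstream history, version 1.0.0
echo "core" > core.py
git add core.py && git commit -qm "upstream work" && git tag 1.0.0
# downstream patches branch: the upstream tree plus our patches on top
git checkout -qb master-patches
echo "downstream fix" > fix.py
git add fix.py && git commit -qm "Downstream-only fix"
# a new upstream version arrives...
git checkout -q master
echo "more core" >> core.py
git commit -qam "more upstream work" && git tag 1.1.0
# ...and instead of fiddling with .patch files, we rebase the branch:
git checkout -q master-patches
git rebase -q 1.1.0
# the patch files that land in dist-git are regenerated from the branch
git format-patch 1.1.0
```

When the rebase is clean (as here), the regenerated patch file is guaranteed to apply to the new version, with no manual patch surgery.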
I put it into rdopkg. That's why it's pretty big now, and I actually need to split it. So currently you can use rdopkg, and I encourage you to; I think the Ceph folks are using rdopkg to package, and they even have some issues open on GitHub. More and more people are using it. So what is the use case, when do you want to use it? It is when you're chasing an upstream project that is pretty fast, releasing often, and you need to maintain downstream patches. There are lots of projects that fit this description, I believe. When that is the case, you really want to use rdopkg. But it's kind of big, it has lots of requirements, it does lots of weird stuff, and it's not modular enough, so I need to fix that before it gets too many users. Now is the last time to fix it, because once too many people start to use it, I can't change anything. So I have this crazy plan to actually split it into three parts. One of them would be a beautiful, modular CLI framework. Actually, maybe you can tell me if this already exists, but I was trying, because for example python-openstackclient, which loads all the other OpenStack client libraries when you run it on the command line, takes a second to start. A second of Python imports. So that's a no for me: I don't want to wait a second for my CLI to finish one action, loading modules that are mostly unnecessary because I only need a subset. That's a complete no. There is stevedore in OpenStack, but it has unreasonable dependencies, so I chose to create this simple schema where a Python module carries, in its __init__, just a descriptive structure that describes its entry points: how to call it from the command line and how to interact with it from other modules at the top level. There is some convention on that again, but it's a simple one; you look at it and you understand it.
So thanks to that, only this information is loaded on module load; when you actually want to use something from the module, only then does it get imported. There are some nice exceptions, nice errors. Another important point is wrapping system calls, because lots of rdopkg commands are really just calling system commands: it's using fedpkg and all the tools you would use manually, it just runs them for you. So if you choose to drop rdopkg, you just return to whatever you were doing before, your scripts or someone else's scripts. It's not forcing itself anywhere and then locking you in; it's just this extra automation layer you can, but don't have to, use at any time. So one part of the split would be the modular CLI framework. On top of that would be panpkg, which is the part you might hopefully be interested in: the package management. I will take everything from rdopkg that I think would be useful to a generic packager who is not affiliated with the RDO project, who is just maintaining his own packages. I would like to put everything in there for such a person, to make his life easier. That's panpkg. And finally, rdopkg would sit on top of that and contain only our weird RDO-specific functionality that most of you are probably not interested in. So whichever part of rdopkg you are interested in, I hope you'll be able to get it in as small a package as possible. For now there is only a manifest on GitHub, no code to show you yet, because it's quite a big bite and I'm going along the lines of: if you want something done properly, do it yourself. So I will first try to do the best I can, and once I'm happy with it, I will release it and hope for other people to make it even better. So basically I'm not the first... yeah, I'm not sure how to put this. Well, the thing is, when I was doing the Koji integration, I wanted a simple thing: I wanted to pass Koji a package and I wanted to monitor it.
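The non-lock-in point is easy to demonstrate with plain git: patches maintained on a `-patches` branch export as ordinary git patches, so anyone who has never heard of rdopkg can still consume them with `git am`. A throwaway sketch with invented names:

```shell
set -eu
tmp=$(mktemp -d); cd "$tmp"
git init -q -b master proj && cd proj
git config user.email demo@example.com
git config user.name Demo
echo "v1" > app.txt
git add app.txt && git commit -qm "upstream release" && git tag 1.0.0
# one downstream commit on the patches branch
git checkout -qb master-patches
echo "downstream tweak" >> app.txt
git commit -qam "Downstream tweak"
# export it as a plain patch file, the same kind dist-git carries
git format-patch 1.0.0
# a branch that has never seen rdopkg can still apply it with stock git
git checkout -q master
git am 0001-Downstream-tweak.patch
grep "downstream tweak" app.txt
```

So if rdopkg disappears tomorrow, the artifacts it manages are still ordinary git branches and ordinary patch files.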
So I would expect something like these few lines of code here, but in fact it was over 130 lines of hacks. I'm not sure if you want to see the file. Do you want to see it, how horrible the code is? Yeah? Okay. So probably the greatest thing is the fedpkg Commands class: it requires non-optional arguments, but they are not individually named, it's just a list of arguments. And the funny thing is that these arguments are actually read from a config file and then manually split and passed to the function. And of course this is the God class: it encompasses the whole universe and much more. If you actually manage to somehow instantiate it, which is not very clear how to do (even the examples in the main file of the module are wrong), then you need to rip some things out of it in order for it to work. For example, the loggers are hardcoded into the class, so you need to somehow set that up; you need to stub out some random things from the God class to actually be able to use it, and I have no idea how you're supposed to know that. And then there is a config NoSectionError raised when you press Ctrl-C, and things like that. It's really horrible. And actually, when I started writing rdopkg, of course I looked at the already existing tools. I thought: oh my god, people needed to solve this so many times before, I'm surely not the first one trying to solve this in a proper way, right? So either that's not true, or I failed in my search, because I also met some people who had some cool functionality in mind that they would like to see, and they were looking for the right tool to put it in, right? My first guess would be fedpkg, because I used it: fedpkg update updates your package, fedpkg srpm gives you an SRPM, it's a pretty simple, nice thing. But no: I opened that code, I tried to figure out how to reuse the functionality, and just no.
And I also met other people who were interested in fedpkg and rdopkg for the same reason: they really wanted to contribute, but it is old legacy code. I'm not blaming anyone; it needed to be written, it's written, and it's kind of working, but it's not pleasant to contribute to. For me, I was looking at the code for some time and I was just like: no, I'm not going to do that. So my goal is to provide code that is not horrible to contribute to. Once I do the split, panpkg would be what this presentation is really about: rdopkg is our niche interesting stuff, but panpkg, I hope, could become something like a standard for packaging. If you are maintaining lots of patches, I hope panpkg would be the way you do it, and it's non-intrusive, so that should be pretty okay. And I don't want to force people to look forward to something that's not written, so I will write the code, I will rewrite all the tools, and then I will present these as examples that it's worth it, and then I want to convince the people who maintain, or are interested in rewriting, the other tools to use this, if it proves to be worth it. Maybe I'm just egotistic and this is complete bullshit; we'll see. But I think it's really better than what is there. For example, I would like fedpkg, centpkg, rhpkg and all these tools that are, from my point of view, not very pleasant to contribute to, to be rewritten, should this prove to be a sufficient improvement. So that's my secret plan; you don't need to worry about it, but if you find it interesting, feel free to contact me, because I'm serious, I will really try to do this. And I think most of these tools were actually written by a few people with enough motivation and ambition, so if I found such people, it could be done again, but better, in a way that other people can contribute to. Because this is a huge letdown: how many people had an awesome feature that could
already be in fedpkg, but were turned away by the code, simply because it wasn't fun? So I think this is a serious problem which I'd like to solve. But now let's go to the examples. I wasn't really sure about the format of this, so: how many of you are maintaining an RPM package, in Fedora or somewhere else, that also has more than one downstream patch? Yeah, okay. So if you are interested, I would suggest you start with your own package: set up your git remotes properly so you can use rdopkg on it, and see if it helps. So first, get rdopkg. I was warned the internet here is horrible, but see for yourself: thanks to Copr, you can simply use dnf to install it. See the rdopkg README for a wide variety of options: you can git clone it and install with setup.py --user, you can pip install it, but if I were you, I would just use Copr. So: dnf copr enable jruzicka/rdopkg, and then dnf install rdopkg. How many of you are trying this right now? Okay, that's not many; I would hate to do an interactive presentation when no one's following along, I'd just be blabbering. Sorry. Okay, anyway, try to get rdopkg if you want. I will explain the conventions now, the ones I was talking about before. So we have the dist-git, that is a well known thing: it is just a git repository with everything that is needed to create an RPM package, most importantly a spec file, but possibly other files such as patches, service files, and whatever else you need that is not included in the upstream tarball. When working with a Fedora package you would just do fedpkg clone to get the dist-git. If you wanted to have your own package in Copr or wherever else, you could use any git repository you want for your spec file; for example rdopkg has its spec file in its own repo, so it's kind of like the dist-git and
the upstream repo at once; that's also supported. So this is the simple part. The next convention is important, and it's a simple thing, but I think if all packages followed it, things would be much simpler. It's just a convention that we follow in RDO, and it has helped us immensely over time to not go mad from all this inflow of patches. It is explained in the rdopkg documentation, which I also wrote. So imagine the latest package in Fedora; when you clone it... actually, let me try that here. You can see rdopkg just threw an error message at me; notice that it told me which shell command it used and what was on standard error. This is the approach I like to take: I don't want to reinvent the wheel, I'm just using the tools, and rdopkg provides the glue between the tools that you probably already use. Okay, so rdopkg clone is one of these RDO-specific commands; it actually did, for me, everything from the next few slides: it cloned the Fedora dist-git and it also cloned the patches branches, which for RDO, at least by convention, live per project name. Let me show you how this looks; this is too little screen space, okay. So notice there are branches, and there is one branch for each OpenStack release. That's kind of changing now, but it doesn't matter; I would like this to be generic. So these are the patches branches for python-novaclient, for our RDO package, one per release. Let me show you what one contains; it's not very visible on GitHub here, so let's look at juno. Alright, so in this branch, this is basically the upstream tree: this is what upstream python-novaclient has in their repository, up to a version tag. That is the upstream repository up to the version which is packaged. If I wasn't maintaining any patches, it would be exactly that, and in that case the patches branch would be meaningless and wouldn't need to exist. But when I want to maintain downstream patches, instead of having
these static files somewhere in the dist-git, they are actually maintained as patches using git, which is exactly the thing git is really good at, so I use the right tool for the job. So here: there is this runtime dependency on python-pbr (PBR brought so much pain into our world, so meta... anyway). This patch is downstream-only; it's our patch, which I created, and there is another patch on top of it. So these are two downstream patches, but I could also have upstream patches; I might, for example, want to cherry-pick something from upstream. All the patches that sit on top of the packaged version are maintained through git, not as patch files; we'll get to what that means later. So this is the patches branch: it is just a branch of upstream with the package-specific patches on top of it. That is one convention. Another convention states that these branches must end with "-patches". For example, in Fedora, if you are packaging the latest greatest bleeding edge, your dist-git branch is named master, and for the older releases it would be f23, f24, etc. The convention simply says that the patches branch, wherever it lives, must be named like the dist-git branch plus this "-patches" suffix: if you have master, then master-patches contains the patches for the master branch, and so on. So this is the schema for maintaining the patches in git, and then it's just a matter of configuring your local repository so that rdopkg can find the patches branch. Once that is done, it can manipulate the patches branch for you, and that includes mirroring, or rather synchronizing, the patches from the patches branch to the dist-git. So now the patches branch is the authoritative source of the patches; the dist-git only has a mirror of these static files, but they are not maintained in the dist-git at all. rdopkg updates them automatically on a new version or an update. So these are
patches branches, and this is the important concept. It is actually an emergent thing which we kind of needed: we are understaffed and we are engineers, so we solve stuff with as little work as possible, and this sort of emerged; it simply made our work easier. Maybe it will make your work easier as well, maybe not, but it's definitely worth a look. There is also this diagram, the patches branch workflow. We chose a workflow where there is the upstream repo, and we just maintain our patches branch, which is basically the upstream repo with our patches on top. We git rebase these patches when a new upstream version is released, and we git cherry-pick new patches when we want to backport or introduce something. It often happens that something is released and two days later some super important feature hits the repo, and we want that feature, someone wants it, so we just backport it. So it gives you the flexibility to maintain whatever patches you need. There is also a mechanism, which I won't go into too deeply, that allows separating downstream and upstream patches. And finally, rdopkg is the one that makes sure the dist-git is up to date: after this change, the dist-git contains essentially just the spec file, while the patches are actually maintained in a git branch instead of files. The advantages of this are many. The greatest and most obvious one: when a new version of your software comes out and you have patch files in your dist-git, if there was some change on the lines you touched, the patches will probably fail to apply. But rdopkg, with the extra information it has, can just rebase: git knows these are patches, and since the same git history is used, most of the time you don't need to do anything. You just run rdopkg new-version and you just look whether the result is what you wanted; usually it is. The worst case is that the rebase fails, which means that upstream changed
the layout, and your patches are not up to date anymore, so you just resolve the rebase as you always would and then tell rdopkg to continue with its work. So, back to the presentation. That was the patches branch. And finally, the remotes: rdopkg works with the dist-git as the origin remote, and the second convention is that the patches remote should be called "patches", and the patches branches should have the same name as the dist-git branch they are associated with, plus "-patches". And finally, this one is not a strong requirement, it's just for some bonus functionality: when you add an "upstream" remote pointing at the upstream git project, rdopkg is able to fetch the latest upstream version for you. So with this set up you can, for example, run rdopkg new-version without any parameters. I really try hard to make rdopkg configuration-less, convention over configuration; I really hate configuration, and I think state is the doom of programming, so I always avoid state when I can. This setup is the only thing you need to take care of: instead of having crazy configuration, rdopkg forces you to use this convention, but as a result you don't need to configure anything and it should just work. So this is how your remotes should look; actually, let me show you. This is the dist-git from Fedora; the patches branch, for us, lives under our openstack packaging organization on GitHub, but you can have it anywhere. Actually, if you had your own dist-git, not Fedora or CentOS but your personal repo or something like that, you could even have both in one repo, the dist-git and the patches branch together; that should be supported. And then the upstream remote. So now, for example, I should be able to do this: okay, this is the python-novaclient package, it's currently at version 2.4.1 but upstream is already at 3.2.0, so you already get some small bits of functionality: it can tell you that. And if I go to this directory, and I have lots of them, these are the packages I sometimes
interact with, I just go to the directory and do git fetch. Oh, and this is the only non-rdopkg command; the rest should be one or two rdopkg commands to solve most of the things I ever want to solve with a package. Yeah, so: the dist-git branch is master, the local patches branch is master-patches, the remote patches branch is patches/master-patches. It does various detections, and it tries to be really smart so you don't have to think, because thinking hurts. When you set things up the way it expects, it will work; if you don't, it will just stop and tell you: I don't know how to figure out this parameter, you need to tell me somehow, or set it up correctly. It should be non-destructive: the result of an action is a commit in your dist-git, and you just inspect whether the work is good enough; if it's not, you use your monkey superbrain and fix it to be even better. So this output tells me I've set it up properly. So let's go, finally, to the examples. Simple things first: imagine I want to do some simple fix. You would probably use rpmdev-bumpspec for this; let's see how it goes with rdopkg fix. Okay: "Action required: edit spec as needed and describe changes in the changelog." Fair enough, man. So, what don't I like... hmm, I don't like the whitespace here; I need more space. (Sorry. Someone asked whether this updates the version of the package; good question, I will show that, give me a second.) So notice it already produced the changelog entry: it takes my credentials from git, so there is no parameter passing, no configuration; you just set up your git correctly. The same way with ssh: it relies on your ssh config, as opposed to introducing its own variables and things to configure. It always tries to use the already existing tools. So the changelog entry is prepared. Well, for me; I actually had a script for this before, called rpm-ch, which put this line into my clipboard, and then I pasted it. That's how I did it before; now rdopkg does this ugly part for me and I just really enter the change
line, which is "I need more space". Let's assume it's actually linked to some bug. Okay, that's done. Yep, I run rdopkg again and it's done. So now rdopkg shows you what it did: it bumped the Release, since this is just a spec file fix, not a new version or anything, so it just bumps Release, and there are my two changes in the changelog. But now notice that it also generated a nice commit message, called "I need more space": the commit message is generated from the changelog. So that's one more step, one more little chore, that you don't need to do. And it also quotes all the bugs referenced, so if this is used in a certain workflow, that can be very helpful. (Audience question: what about straightforward automatic conversion from a Python package to an RPM, with no spec file along the way?) Yeah, yeah, okay, that's a good question. I've seen this kind of automatic conversion many times, and I've seen automatic documentation. Ideally everything should be automatic unless there's an explicit need to write something by hand; that's the perfect state I'm aiming for. But the automation doesn't always bring good results, and often the result is worse than nothing. For example, there is lots of auto-generated documentation in OpenStack, and I ended up just rewriting it manually. Same with this: if the world were one lovely place where everyone respects conventions, looks at how other people do things, and tries to do it the same way, it would be great, but the reality isn't like that. Even in OpenStack, which is like one banner for all these projects and they should all be the same, it's not true; half of my work here is solving special edge cases. Like, all of the projects have version tags, number dot number dot number, but guess what, some also put a "v" at the beginning. And you know, there are these small differences here and there which make this impossible, and
in the end you want most of the stuff automated, but you want a way to do it manually when the automatic part fails, and that should be easy. (Audience: so if a package is maintained with the tool you mentioned and the result is not to your liking, what would you do? Basically, is it a one-shot tool, where you generate a spec file once and then maintain it manually? Or could you let the tool generate the spec file, make some changes, and when a new version arrives, let the tool generate a diff to apply, so that for example changed requirements and the changed version could be updated?) So, most of that functionality is already in rdopkg, which is what we are talking about, but yeah, it could be done; I'm sure parts of it exist all over the place. Lots of effort is duplicated in this, lots of people are doing this, so I want to provide a central way. In RDO we have this one spec template, made with love by our packagers, which we try to copy across all the projects; that's why we don't use a generator, because we already tried it. rdopkg changes the spec the way a developer would: only the smallest changes, and there are tools to inspect it, to inspect the requirements changes and stuff like that. But in my experience there is a certain level of automation that is too much. Even this tool, which can be used to fully automate the process, should still be just something on top, not something required to actually get the job done. So if you dropped rdopkg and went back to maintaining your spec file manually, it should be fine. So, this was a simple fix. Notice that I ran one action, and the result of my operation is a commit; if I don't like it, I just kill it with fire. So that was fix. Next... that's boring really, but it's
still better than rpmdev-bumpspec, I think. Slightly.

Okay, so now the interesting part comes. Let's see: there are no patches here... there are some patches. This is the f22 branch of the python-novaclient package, so this is basically the Fedora 22 package that you know if you're using Fedora 22. And the patches branch is stable/juno. This is some rdopkg automagic; let me show you how it works. So, stable/juno: there are two patches again, "remove frontend dependency" and "change something somewhere". I want to introduce a new patch from upstream onto this branch. So I'll switch to stable/juno, then I'll just cherry-pick a patch from upstream, which should be upstream master. Which one? I don't know; for example, "document this parameter". Oh yeah, it applied, great. So now I have this extra patch on top of my patches branch (and I have an error in the presentation, all right). I added the patch to git; how it gets there is a thing of your workflow. You could have some Gerrit code review, or you could just push the patches as you want; if you're the master packager, too awesome for processes, you can just push them, that's your thing. But once the patches are in the patches branch and rdopkg knows where it is, it can do its magic. So I just git cherry-picked the patch I wanted, and then I ran one rdopkg command, and this is the result: it found the release, it generated the patch, it added the patch, it even added the patch-apply options. It can also understand the modern ways of applying patches in RPM, like through git apply (and the other one, which I don't even remember). It generated a nice changelog. Done; introducing a new patch is that simple. Now imagine I would like to also reference Bugzilla.
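Before moving on to the Bugzilla references: the spec change that the patch action makes, adding the next free PatchN: line after the existing ones, can be pictured with a toy function. This is a simplified sketch of the idea with a hypothetical spec snippet, not rdopkg's implementation (the real tool also handles how patches are applied in %prep):

```python
import re

def add_patch(spec_text, patch_file):
    """Insert a new PatchN: line after the last existing one,
    using the next free patch number."""
    lines = spec_text.splitlines()
    nums = [int(n) for n in re.findall(r"^Patch(\d+):", spec_text, re.M)]
    next_num = max(nums) + 1 if nums else 0
    patch_idx = [i for i, l in enumerate(lines) if re.match(r"Patch\d+:", l)]
    insert_at = patch_idx[-1] + 1 if patch_idx else len(lines)
    lines.insert(insert_at, "Patch%d: %s" % (next_num, patch_file))
    return "\n".join(lines)

# hypothetical spec preamble for illustration
spec = "\n".join([
    "Name: python-example",
    "Patch0: 0001-Remove-frontend-dependency.patch",
    "Patch1: 0002-Change-something-somewhere.patch",
])
print(add_patch(spec, "0003-Document-this-parameter.patch"))
```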
So I'll open the spec file and write, I don't know, "this is stuff, rhbz#123" and "this, rhbz#9". Then rdopkg amend: it's basically like git commit --amend, but it regenerates the commit message from the changelog, so the commit message looks exactly like the changelog does. And finally... "this shouldn't be updated with patch"? I seriously need to fix the presentation; I will actually do it right now, because I haven't used this for a while.

Okay, finally, the most useful action. I don't know if I will be able to do a proper example, but let's try. This is the most useful one, and this action took me the most time when I was doing it manually: when a new version of the package is released. In the lucky case, when you have the patches stored in dist-git as flat files, you can get lucky and the patches still apply even on the new version; that's awesome and you don't need to worry about anything. But when you have like 40 patches, like certain projects, chances are that some of them will not apply, so you need to git rebase. So let's do it anyway. I don't dare update to the newest version, but let's try 2.2.0. All right, so now rdopkg shows you... oh, this is not very helpful. For every shell command it runs, it shows you the command, so if you wanted to kill rdopkg with fire and never use it again, you could just copy that command and use just that command. It basically does what I was doing manually for this action: I looked at the diff; okay, what changed? Yeah, lots of stuff changed, oh no. Good. It also shows you which requirements changed between the versions and the requirements of the spec; that's a really nice feature. You can see that some requirements were bumped; this is possibly what you were talking about earlier, kind of, right? Okay, so now you can see what it does: it checks out the local patches branch and rebases it. Now, this could be a destructive operation,
so in our workflow it asks me whether to push the new version. You can also run rdopkg in local-only operation, but by default it assumes you're working with remote repos; that's something to consider changing in rdopkg. So, I don't want to push this, no. It also uploads the new sources; it even does the fedpkg upload to the lookaside cache. Oh, and my certificate has expired, so we can't... let me see... disable that. That can be disabled, yep.

So now, say I don't like what happened; I'll just do rdopkg info. It's kind of git-ish: it tries to give you a git-like interface. Here you can see what is running: what command, what the parameters are, what the steps are. These steps are actually meant to be idempotent. You can define actions in rdopkg; that's the CLI structure declaration I was talking about. An action declares a series of steps, and if rdopkg breaks in any one of those steps, you can just rdopkg continue and it will rerun from that step. That way you can do transaction-ish things: you can drop to the shell, let the user fix things using standard tools, and then resume the operation, instead of trying to force everything into your awesome tool. I found this very, very useful; not sure if it really is, but that's how it feels. I can also abort the action, like you would in git. Yep, an action is now in progress; let me try to rerun it without bumping the sources. All right, it worked, actually. Amazing.

Okay, so let's see all that happened here. It showed me the diff and the requirements, then it did some magic, then it did the rebase here. This is what you would probably do manually, literally the same thing, so if this rebase failed, you would just get to solve the rebase as you would when not using rdopkg; once it's solved, you do rdopkg continue and it resumes from there and continues like it did. Then it updates the spec file. And now there is the essential part; it's called update-patches. From
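The idempotent-steps and continue mechanism described above can be sketched as a tiny resumable step runner. This is my own illustration of the idea with made-up step names, not rdopkg's code:

```python
class StepFailed(Exception):
    pass

class Action:
    """A declared series of named steps; progress is remembered so a
    failed action can be resumed from the failing step."""
    def __init__(self, steps):
        self.steps = steps   # list of (name, callable)
        self.done = 0        # index of the next step to run

    def run(self):
        while self.done < len(self.steps):
            name, func = self.steps[self.done]
            try:
                func()
            except Exception as e:
                raise StepFailed("step %r failed: %s (fix it, then continue)"
                                 % (name, e))
            self.done += 1   # only advance past a step once it succeeds

log = []
rebase_fixed = [False]

def diff_sources():
    log.append("diff")

def rebase_patches():
    if not rebase_fixed[0]:
        raise RuntimeError("patch does not apply")
    log.append("rebase")

def update_spec():
    log.append("update-spec")

action = Action([("diff", diff_sources),
                 ("rebase", rebase_patches),
                 ("update-spec", update_spec)])
try:
    action.run()
except StepFailed:
    rebase_fixed[0] = True  # the user drops to a shell and fixes the rebase
    action.run()            # resume: reruns only from the failed step
print(log)  # the 'diff' step did not run twice
```

Because each step only advances the cursor on success, resuming is safe: completed steps are never rerun, which is the transaction-ish behavior described in the talk.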
a historical point of view, it was a standalone packaging script, and now it's part of rdopkg. This is actually the thing, the entity, that introduced the patches branch: the patches branch exists because it was hard-coded in that script. This is the script that exports the patches from the git patches branch, as opposed to having them in dist-git as flat files. So this is the biggest change, and everything revolves around it and the possibilities it gives you. It basically means you can use git on your patches directly, and you want to do that anyway.

So it tells you there are two patches on top of 2.2.0, and 0 are excluded. You can exclude some: this is actually an external feature, someone provided it. There are some problematic patches, and some projects need to filter them out, so they can be excluded; this is now possible thanks to external contributors. It shows you the patches that are on top, and then it does all its funny stuff, and here you can see the final diff. It changed the version to the correct version, it reset the release, and it removed the "document this parameter" patch; that's because I made that change locally on my local patches branch and the patch is not in the actual upstream. It regenerates all the patches, so even though I added some random weird patch here that shouldn't be there at all, it didn't care: it just deleted it and mirrored the patches that are in the patches branch. And there are changelog updates, too. If I, for example, wanted to be more verbose in my changelog, I would just add, I don't know, "some amazing feature", and I can do rdopkg amend again and it updates the last commit. So after running an rdopkg action, you have just one new commit on your dist-git, and this commit contains everything that would otherwise take you a long time to achieve. It depends.

So this is the core functionality. I'm not sure if I have... oh yeah, there is one more
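The update-patches behavior described above, mirroring whatever commits sit on top of the version tag in the patches branch into numbered flat patch files while honoring exclusions, can be sketched like this. It is a sketch of the idea only; the real workflow drives git format-patch rather than building filenames by hand:

```python
def update_patches(commit_subjects, excluded=()):
    """Mirror the commits on top of the version tag into numbered flat
    patch files, skipping excluded ones (illustrative sketch)."""
    files = []
    n = 1
    for subject in commit_subjects:
        if subject in excluded:
            continue
        # derive a filename slug the way format-patch roughly does
        slug = "".join(c if c.isalnum() else "-" for c in subject).strip("-")
        files.append("%04d-%s.patch" % (n, slug))
        n += 1
    return files

commits = ["Remove frontend dependency",
           "Change something somewhere",
           "Document this parameter"]   # a local-only patch to drop
print(update_patches(commits, excluded={"Document this parameter"}))
```

This mirroring is why a random patch file that is not in the patches branch simply disappears on the next run: the flat files are regenerated from the branch, never the other way around.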
thing to show: the advanced requirements management. There are lots of manuals; rdopkg has multiple manual pages, and the manual is online, so reading the rdopkg documentation is recommended. What you can do is use various actions to inspect Python requirements. This is a bridge between Python requirements and RPM Requires: you can see what was added, changed, or removed. Just look at this quickly; I think you get the point. You can also query packages across RDO (this is RDO-specific): you can see what versions of packages are in which repos, which is also quite useful. And this is the most brutal action: you can query for an entire requirements file and see if those requirements are met across the distribution.

So that's about it. There is much more RDO-specific functionality that came into existence and is not needed anymore, so maybe it will be removed. What I just showed you, the patch management functions, I would like to make into a separate module; that is why it should exist. It should be a fairly small module that does just that, and it should be nice to use. So if you like what you've seen, definitely join me in that astonishing revolution.

rdopkg is using GitHub. I'm okay with the issues there; at least it's not as slow as Bugzilla. So if you want something in rdopkg, do use the GitHub issues; I really read them. If it's a bug that I think is breaking things (not an RFE; there are lots of RFEs), I usually fix it within a week or so; I try to be a really fast upstream. I build rdopkg in Copr, so it's not a problem for me to build a package immediately once I have the fix. I also do poke-driven development: if you want me to do anything, you need to poke me, because lots of people want lots of stuff, lots of emails are sent, and lots of meetings are held,
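The requirements inspection mentioned above, reporting what was added, removed, or version-changed between two requirements snapshots, boils down to a diff of parsed requirement lines. A minimal sketch of the idea (not rdopkg's code; the real tool also maps names to RPM Requires):

```python
import re

def parse_reqs(lines):
    """Parse 'name>=version'-style requirement lines into {name: spec}."""
    reqs = {}
    for line in lines:
        line = line.split("#")[0].strip()  # drop comments and whitespace
        if not line:
            continue
        m = re.match(r"([A-Za-z0-9._-]+)\s*(.*)", line)
        reqs[m.group(1).lower()] = m.group(2).strip()
    return reqs

def reqs_diff(old_lines, new_lines):
    """Report requirements added, removed, and changed between versions."""
    old, new = parse_reqs(old_lines), parse_reqs(new_lines)
    return {
        "added":   sorted(set(new) - set(old)),
        "removed": sorted(set(old) - set(new)),
        "changed": sorted(n for n in set(old) & set(new) if old[n] != new[n]),
    }

# hypothetical requirements.txt contents for two versions
old = ["pbr>=0.6", "six>=1.7.0", "argparse"]
new = ["pbr>=1.3", "six>=1.9.0", "requests>=2.5.2"]
print(reqs_diff(old, new))
```

Showing this diff before a version update is exactly the "nice feature" mentioned earlier: the packager sees up front which Requires lines in the spec will need attention.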
but I'm not interested in that; I'm interested in getting shit done. So when someone really needs something, they poke me. If you really need something from rdopkg, from me, just poke me; I'm on the RDO IRC channel, always happy to help. It's my child and I really like it. Maybe it's not the greatest project ever, but you know, it's mine.

So that's it, that's finally the end; I can barely speak any more. Thank you for your attention; I hope this was useful, at least in some regard. It's a very specific topic: not many people are interested in packaging, and those who are have their own scripts, you know. So it's very few people, and since it's so few people, we shouldn't duplicate effort. I think it should be possible to settle on something like this and move forward together instead of duplicating the work. I, for example, found the Fedora update process too cumbersome. I understand that Fedora, like any distribution, wants some degree of stability, and being able to update packages too fast works against that, but for me the process is just not pleasant, as opposed to building rdopkg in Copr; that's easy. I actually have an rdopkg action to update rdopkg itself, so I just run it, rdopkg inception, and it's all done, and I don't need to go through some weird processes. So if you're building something similar, maybe this is also for you.

That's all from me. And finally, there is a meetup after this presentation with some cool folks from our team: sightseeing, burnout, having beers. So if you're interested in hearing more from me or from them, feel free to join. And that's it; thank you for all your attention.

[Host] Yes, your questions, right now. So, if you have any questions...