Okay, hello, welcome everybody. Ralf Treinen is here to talk about EDOS: what it does, what is new, and what the future holds. Let's see what he has to say.

All right, so I would like to tell you something about the things we have been doing recently in our research group. I put EDOS in the title, although it isn't called EDOS any longer, and hasn't been for some time now, but I will come back to that a bit later. First of all, this is joint work; more people have been involved in it: Pietro Abate, who was already mentioned in the talk by Wookey, and who is the main author of our library and of our tools; Roberto Di Cosmo, who was behind the two European research projects that we had, and who is the founder and leader of the IRILL research lab in Paris where I am currently working; and of course everybody knows Zack, our beloved Debian Project Leader, who has also been participating in this research.

Okay, so this is the starting point. I won't talk much about this, because it is something we have presented many times, at FOSDEM and at this conference, so just a quick reminder. Our starting point is what we were doing some five or six years ago, and it gave us a tool which is today called edos-debcheck. What does it do? It takes a package repository, like the Packages file that you find on your favorite Debian mirror, and it looks only at the relations between binary packages: dependencies, conflicts, provides and so on. It tells you, for each package in the repository, whether the package is in theory installable, that is, whether it is possible to satisfy all its dependencies and conflicts with packages that are present in this repository. It does this by looking only at the metadata: it does not try to install anything, and it does not look at maintainer scripts.
It does not look at code at all; it just looks at the package relations. However, it uses a complete solving algorithm, and this is one of the main features that distinguishes it from aptitude or from apt: behind it there is in fact a SAT solver, which finds a solution every time a solution is theoretically possible. We have a fast implementation of this, which comes in several flavors: one flavor is for Debian, another one is for RPM, and today we have a generic tool called distcheck which also works for other, related component models like OSGi plug-ins.

Okay, so why did we continue to work on this, given that it is something we did five or six years ago? Well, this gives part of the answer. We are running our tool on the Debian distribution, for all the architectures and for all the areas, every day, and out of it you get something like this: the table of the results we get for each day, where the lines are the different days and the columns are the different architectures. This is only a part of the results, and in fact it is a bit old, from September 2011; when you look at the figures of today, it is much, much better, probably an effect of the freeze, so packages are much better behaved today. We see here, for instance, for i386 some 400 packages which are not installable, for armel also some 400, and for the less popular architectures it is even more packages which are not installable. The problem is this: if you clicked on this figure, you would get the table of all the packages found not to be installable, with an explanation of why each one is not installable. And the problem with that is, imagine you tried to start filing bug reports based on this data. It's just too much.
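The metadata-only check at the heart of debcheck/distcheck can be illustrated with a toy model. This is a sketch, not the real dose implementation (which encodes the problem for a SAT solver); the package names, record layout, and naive backtracking search below are all mine, but the search is complete in the same sense: it finds an installation set whenever one exists.

```python
from itertools import product

# Toy, metadata-only installability check in the spirit of edos-debcheck:
# we look only at Depends/Conflicts between binary packages, never at
# files or maintainer scripts.  The real tools encode this as SAT; this
# naive backtracking search is complete too, just exponentially slower.

OPS = {"<<": lambda a, b: a < b, "<=": lambda a, b: a <= b,
       "=": lambda a, b: a == b, ">=": lambda a, b: a >= b,
       ">>": lambda a, b: a > b, "any": lambda a, b: True}

def matches(pkg, atom):
    """Does package (name, version) satisfy the atom (name, op, version)?"""
    name, op, ver = atom
    return pkg[0] == name and OPS[op](pkg[1], ver)

def consistent(sel, repo):
    """At most one version per name, and no Conflicts atom of a selected
    package may match another selected package."""
    names = [n for n, _ in sel]
    if len(names) != len(set(names)):
        return False
    return not any(matches(q, atom)
                   for p in sel for atom in repo[p]["conflicts"]
                   for q in sel if q != p)

def installable(root, repo):
    """Complete search for a consistent set containing root in which every
    dependency (a list of alternatives) of every member is satisfied."""
    def solve(sel, agenda):
        if not consistent(sel, repo):
            return False
        if not agenda:
            return True
        pkg, rest = agenda[0], agenda[1:]
        options = [[q for q in repo if any(matches(q, a) for a in alts)]
                   for alts in repo[pkg]["depends"]]
        for picks in product(*options):   # one provider per dependency
            new = [q for q in picks if q not in sel]
            if solve(sel + new, rest + new):
                return True
        return False
    return solve([root], [root])

# alpha is installable (its first alternative, beta, works); gamma is
# not, because its only candidate provider conflicts with gamma itself.
REPO = {
    ("alpha", 1): {"depends": [[("beta", "any", 0), ("delta", "any", 0)]],
                   "conflicts": []},
    ("beta", 1):  {"depends": [], "conflicts": []},
    ("delta", 1): {"depends": [], "conflicts": [("gamma", "any", 0)]},
    ("gamma", 1): {"depends": [[("delta", "any", 0)]], "conflicts": []},
}
```

The important point, as in the talk, is completeness: unlike a heuristic resolver, this search only answers "not installable" when no installation set exists at all.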
There are too many non-installable packages. So before one can do something with this data, one has to look more precisely at why it happens at all that packages are not installable, and one has to ask whether we should worry about these non-installable packages in the first place.

So why might a package be non-installable in sid? Well, there are some easy cases. One easy case is quite obvious: it might be a transient problem, due simply to the fact that a maintainer has uploaded a package whose dependencies are not yet available for this architecture. In that case the problem will just go away in a day or two; nothing to worry about. These cases can be found very easily, just by looking at the age of the problem report produced by our tool. A second source of non-installable packages that we do not have to worry about is this: we have Architecture: all packages, and with our current infrastructure these packages are propagated into the Packages files of all the architectures. It may happen that such an Architecture: all package is in fact intended only for some architectures; its architecture-specific dependencies may simply not be available elsewhere, and then we find Architecture: all packages which are not installable on some of the architectures. Again, nothing to worry about, and again this can be filtered out quite easily, just by looking at the Architecture field of the binary package.

However, there are other cases which are not so easy, and some of these are cases where we can pinpoint a location where something has to be done. It might be, for instance, that a package p depends on some non-installable package, or that it depends on packages which are at the moment in conflict with each other. In such a case our tool would report package p as not installable, because that is precisely the claim to fame of this tool: it does a deep analysis of all the possibilities to install a package, and it finds the problem whenever there is one. Now, it might be the fault of package p itself, or it might be the fault of one of the packages that p depends on, and this is what we would like to find out. It would be very nice to have a tool which could tell us, in such a situation: here is a package p which is found to be not installable, and here is who has to work on which package and fix something in order to make it installable again. That is what we would like to have, ideally. However, this goal, stated this way, cannot be achieved by an automatic tool, of course not: to find out who is to blame, you need some understanding of why a certain dependency or a certain conflict is there, and of how they could possibly be changed in order to make the package work again. An automatic tool which just looks at the dependencies cannot decide this; only a human could. Maybe we could do it with machine learning, but this is something we are not doing at the moment.
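The two easy filters just described, report age and Architecture: all mismatches, are simple enough to sketch. The record fields, the three-day threshold, and the category names below are invented for illustration; they are not the actual dose report format.

```python
from datetime import date

# Sketch of the two "easy" filters: very recent reports are probably
# transient, and old reports about Architecture: all packages are
# probably harmless propagation artefacts.  Everything else needs a
# closer look.  All field names and thresholds here are invented.

def easy_filter(report, today, max_age_days=3):
    age = (today - report["first_seen"]).days
    if age <= max_age_days:
        return "transient?"          # likely to disappear in a day or two
    if report["architecture"] == "all":
        return "arch-all mismatch?"  # probably not meant for this arch
    return "needs a human"

r1 = {"package": "foo", "architecture": "any",
      "first_seen": date(2011, 9, 20)}   # appeared yesterday
r2 = {"package": "bar-data", "architecture": "all",
      "first_seen": date(2011, 9, 1)}    # old Architecture: all report

print(easy_filter(r1, date(2011, 9, 21)))   # -> transient?
print(easy_filter(r2, date(2011, 9, 21)))   # -> arch-all mismatch?
```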
However, what we can do is, for a subset of these problems, pinpoint the situation where there is a package such that it is precisely this package which has to be fixed: no one else can do anything about it. This is the situation we would like to detect. First I will give you a precise definition of what this means; this definition is completely theoretical, it is not effective. Then I will try to explain how the definition can be turned into something which can be recognized effectively by an automatic tool.

Okay, so this is the idea. When I write p(n), I mean a package with name p and a certain version n, and I am looking at a situation where such a package p(n) is not installable with respect to the current repository, which I call capital R. Now, the situation where I can say it is p's own fault that it is not installable is the following: no matter what all the other packages in the repository do, if they move on to newer versions, if they change their dependencies, their provides, their conflicts, whatever; no matter what all the other packages do, this package remains not installable. If this is the case, then I call the package outdated. This is our definition of outdated, because then I know that this package is not installable and only the package itself can do something about it: it has to fix its own dependencies or conflicts in order to become installable again. So why does this make sense?
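For reference, the definition just given can be written in symbols (the notation is mine, not from the slides):

```latex
% p_n : package p at version n;  R : the current repository
% F(R): all futures of R, i.e. repositories obtained from R by removing
% packages, upgrading packages (never downgrading) with arbitrarily
% changed relationships, or adding new packages
\[
  \mathit{outdated}(p_n, R)
  \;\Longleftrightarrow\;
  \forall R' \in \mathcal{F}(R):\ \neg\,\mathit{installable}(p_n, R')
\]
```

Here the set of futures is exactly the one made precise later in the talk.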
Well, this makes sense only because we may have dependencies and conflicts with version constraints, as you will see in a second in the examples. Here are some examples; these are artificial, but at the end of the talk you will see some examples that I got out of my tool. In this first example we only ask whether a certain package is installable at all; on the next slide we will ask whether it is outdated. So what do we have? The package foo, version 1, has a dependency which consists of two alternatives: baz (>= 2.5) or bar (= 2.3). When you look at this for a few seconds, you see that this package is quite obviously not installable: the first dependency asks for either baz in a version at least 2.5, or bar in a version precisely 2.3, and we have neither in the repository; bar and baz are both there only at version 2.2, say. So the dependency is not satisfiable at the moment, and you find that package foo in version 1 is not installable. Now, for precisely the same set of packages, is foo 1 also outdated? The answer is no, it is not. Why not? Because even though foo 1 is currently not installable, it may happen, for instance, that bar advances to version 2.3, which would satisfy the dependency. Baz could advance to some greater version as well, but in fact that is not even necessary. So baz can remain at its current version; just bar has to advance to version 2.3 in order to make foo installable. In this case the package is not outdated.

Now let's look at this variant of foo, which is only slightly changed: what has changed are just the version numbers and the conflict. Here we have a case of a package foo which is in fact outdated: it can only be fixed by changing the dependencies of foo itself. No matter what the other packages bar and baz do, they cannot make this package, with these dependencies, installable. Why? The setup is roughly this: foo depends on baz (>= 2.5) | bar (= 2.3) and also on baz (<< 2.5) | bar (>= 2.6), it conflicts with bar (>= 2.6), and the repository already has baz at 2.5 and bar at 2.3. Look at the first line: you need baz at version at least 2.5, or bar at exactly 2.3. Imagine you satisfy it with baz; baz 2.5 is already here, that's fine. But because baz is already at version 2.5 and can never be downgraded, the alternative baz (<< 2.5) in the second dependency can never be satisfied, so we have to satisfy it with bar (>= 2.6), and that contradicts the conflict. If, on the other hand, we satisfy the first line with bar at version 2.3, which is already here, then we cannot have bar (>= 2.6) at the same time, so we would again need baz (<< 2.5), and that is impossible because baz is already at 2.5 and we cannot downgrade a package in the repository. So foo can never become installable, whatever the others do. This is the kind of analysis that we would like to do automatically with our tool.

All right, so now I have given you some intuition. To make it precise, you have of course to define what exactly it means to move from one repository to a future version of it: what precisely may happen when the packages in your repository move on?
So what may happen? Packages may be removed. Packages may move to newer versions, but they cannot be downgraded in the repository; that is in fact the only thing which cannot happen. When packages advance to newer versions, we assume that in the process they can change their package relationships in any arbitrary way. This is a very liberal assumption, but it is the only safe one we could make. There is possibly some leeway here to improve the analysis by giving a more precise, more narrow definition of how relationships can evolve, but for me this is an open question at the moment; if you have ideas about it, I would be keen to discuss them. So: packages can be removed; they can move to newer versions; when they do, they can change their relationships in any way; and new packages can be introduced. This means there are many, many ways a repository can change; in fact there are potentially infinitely many possible futures, which is of course a problem if you would like to make this automatic. One remark: at the moment we are assuming that the binary packages in the repository move independently of each other. This is of course not true, because as we know binary packages are uploaded via source packages, but I will come back to this a bit later.

Okay. Now we take this definition, which is completely precise but not effective, and try to turn it into something effective. The first question one can ask is: do we have to care about package removals from the repository? The answer is no, we do not. Why? If a package is not installable in any future in which no packages are removed, then it is not installable in any future at all, because removing packages from the repository cannot fix uninstallability; it cannot fix anything. So removals of packages are not relevant for us; we can ignore them when we look at the potential futures of the repository. Next question: what happens when we introduce new versions of packages? A priori, these new versions can change their relations in any arbitrary way; do we really have to care about all these possibilities? The answer is again no, because for the purpose of this analysis we can simply assume that new versions of packages behave as nicely as one could imagine: they have no dependencies and no conflicts, and they just exist in order to satisfy the dependencies of already existing packages. The reasoning is this: if a package is installable in some future, then it is also installable in the corresponding future where the new versions have had all their dependencies and conflicts erased, because removing relations can only make installation easier. So if a package is not installable even in the futures where all the other packages are as nice as possible, it is not installable in any future at all, and it is therefore sufficient to look only at the nice futures. As a result, we can assume that new versions of packages introduced into the repository have no dependencies and no conflicts, and since installability is all we care about, this means we know precisely what the future versions of the packages look like. Okay, so this is nice; let's continue. We can also introduce packages with completely new names into the repository. Do we have to care about this?
Yes, in some sense. It may be that a package is currently not installable because it declares a dependency on something which does not exist yet; obviously, when we introduce that package into the repository, the depending package may become installable. So we do have to care about potentially new packages. However, not about arbitrary new packages: a new package can change something only if someone already expresses a dependency on it. So we only have to care about new packages whose names are already known, names that already occur in dependencies, and this is a set which is known a priori, in advance, and which is finite. Okay, so this is nice again. To wrap up what we have so far: we only have to look at a finite set of new package names, the ones which already occur in dependencies; we can ignore package removals; and we can assume for the new packages that we know exactly what they look like: no dependencies, no conflicts. There remains one problem, though: the version numbers. There is an infinite space of possible version numbers that future versions of packages may assume, and this is the remaining problem we have to look at. The way we solve it is explained quite simply with an example. Assume that we currently have a package p in the repository, say at version 5, and now we look at all the dependencies and conflicts on p.

(Question from the audience: I'm sorry, could you explain again why it is okay to assume that new packages have no dependencies and conflicts?) Okay, so, this argument again: we are interested in the situation that in all possible futures a package is not installable; this is the situation we would like to recognize. To recognize it, it is sufficient to look only at those futures in which the new packages have no dependencies and no conflicts, because having dependencies and conflicts only makes the installation problem harder. If installation is already impossible when all the other packages behave as nicely as possible, when they have no dependencies or conflicts themselves and just sit there in order to satisfy my dependencies, then I know it is never possible. That is the reasoning.

All right. So the problem that remains is the infinite set of possible versions. Let's look at the example. We currently have package p in version 5, and I look at all the dependencies and conflicts on p which are expressed in the other packages of my repository; say there is a dependency on p in a version smaller or equal to 9, and one on p in a version different from 12. Now the versions that I have to look at are the following. I take the version which the package already has in my repository, 5, and I introduce as candidate future versions the ones mentioned in these constraints, 9 and 12. This is not yet sufficient, because it may also be interesting to look at versions lying between these numbers; otherwise I would miss, for instance, versions that satisfy certain combinations of the constraints. So, to really cover all the possibilities, I also introduce intermediate points between all the version numbers which are mentioned, here 6 and 10, plus one version which is bigger than everything, 13. And I do not need any version which is smaller than everything, because we only move forward in the future, never backwards. Okay.
So now, in theory, we are almost done. We have a finite number of version numbers to care about, representative of all the possible futures. But if one looks carefully, one can do a bit better and compress this list a little. Here is again what we had on the previous slide: the current version 5, the two constraints, and the list of candidate versions 6, 9, 10, 12, 13. What matters about these version numbers is only whether or not they satisfy each of the constraints we have: smaller-or-equal 9, and different-from 12. When you look at them, you find that there are two pairs of versions which behave exactly the same with respect to these constraints. In theoretical computer science this is called an observational equivalence: you have a set of observations, here the constraints, and you identify things between which you cannot observe a difference. The first pair is 10 and 13: both do not satisfy the first constraint, and both satisfy the second, so you can identify them and keep just 10, arbitrarily. Likewise 6 and 9 behave the same: both satisfy the first constraint, and both satisfy the second, so I keep just the version number 9. Okay, so are we done yet? In theory, yes.

(Question from the audience: why doesn't 5 fall in with 6 and 9 there? 6 and 9 both satisfy the smaller-or-equal constraint and both satisfy the different-from constraint, and 5 seems to be in the same category.)
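The discretisation and compression steps are small enough to sketch in code, using plain integers as versions for simplicity (real Debian version comparison is more involved). Which representative of an equivalence class is kept is arbitrary: the slide kept 9 and 10, while this sketch keeps the smallest member of each class.

```python
# Version discretisation with integer versions.  We keep: every version
# mentioned in a constraint on p, one point between consecutive mentioned
# values, and one value above the maximum; then we merge future versions
# that satisfy exactly the same constraints (the observational
# equivalence of the talk).

OPS = {"<<": lambda a, b: a < b, "<=": lambda a, b: a <= b,
       "=": lambda a, b: a == b, "!=": lambda a, b: a != b,
       ">=": lambda a, b: a >= b, ">>": lambda a, b: a > b}

def future_versions(current, constraints):
    mentioned = sorted({v for _, v in constraints} | {current})
    points = set(mentioned)
    for a, b in zip(mentioned, mentioned[1:]):
        if b - a > 1:
            points.add((a + b) // 2)   # some intermediate point
    points.add(max(mentioned) + 1)     # strictly above everything
    futures = sorted(v for v in points if v > current)  # never downgrade
    seen, reps = set(), []
    for v in futures:
        sig = tuple(OPS[op](v, w) for op, w in constraints)
        if sig not in seen:            # keep one per equivalence class
            seen.add(sig)
            reps.append(v)
    return reps

# The example from the slides: p is currently at 5, and the repository
# mentions p (<= 9) and p (!= 12) in its relations.
print(future_versions(5, [("<=", 9), ("!=", 12)]))   # -> [7, 10, 12]
```

Any choice of representatives works, because by construction no constraint in the repository can distinguish two members of the same class.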
Oh, okay: because 5 is in fact the package that we care about. It is the version we currently have in the repository, the one for which we want to know in the end whether it is installable or not; the other ones are future versions which I introduce only for the sake of the analysis. But yes, you are right: the observational equivalence must be applied only to the future versions.

Okay, so what do we have now? For every package that we have to care about, a finite set of possible future versions. In theory we could now look at all the combinations and construct all the possible futures, but this is of course completely infeasible. We have thirty-five thousand packages in sid; even with only two versions per package, this would give 2 to the power 35,000 possible futures, and believe me, you do not want to compute with that. So this is completely impossible, even if in theory it would give us an answer. However, the way out is quite easy: we take all the different versions of the packages that we have constructed, plus the ones we already have, and we put them together into one large repository. This is a repository which contains several versions of the same packages, but that is something which is perfectly allowed, and we look at this one big repository U which contains everything. It is much, much smaller: in our example a small multiple of thirty-five thousand packages, which is very feasible for a SAT solver. Of course, it has to be justified that this gives exactly the same result with respect to installability, and that needs a bit of reasoning, but intuitively it is quite clear: what we are looking at are the possible installation sets licensed by any of these repositories, and the set of installations that we can get from the single large repository U is exactly the same as the set of installations we can get from all the possible futures. So we just lump together all the versions of the packages and look for non-installability in this one large repository.

Now, this would almost be the solution, had I not made an assumption at the beginning: the assumption that all packages move independently of each other. In reality, as you know, binary packages in the repository advance in a synchronized way, because they come from source packages; when we upload a new version of a source package, for the purpose of our analysis this introduces new versions of a whole cluster of binary packages at once. This has to be taken into account: when we look at our large universe U, which contains all the future versions of all the packages, we have to exclude all installations that would mix different versions of packages coming from the same source. That is not difficult to do; we do it by introducing conflicts. The trick in our tool is this: when we have a binary package with name p and version n, and we find that it comes from a source s, we introduce a versioned provides: the package provides a virtual package, call it src:s, at version n, and at the same time it conflicts with src:s in any version different from n. This way we can never install together packages built from different versions of the same source. Of course, I have cheated a little bit here, at two places.
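The source-synchronisation trick just described can be sketched as a small transformation on CUDF-like package records. The record layout and the src: prefix are inventions for this example; in the real tools the encoding happens inside the Debian-to-CUDF translation.

```python
# Source-synchronisation sketch: every binary built from source S at
# version N gets a versioned provide "src:S (= N)" and a conflict
# "src:S (!= N)"; "src:" is assumed to be a prefix that no real package
# name uses.  All names and the record layout are invented.

OPS = {"=": lambda a, b: a == b, "!=": lambda a, b: a != b}

def add_source_sync(pkg):
    virt = "src:" + pkg["source"]
    out = dict(pkg)
    out["provides"] = list(pkg.get("provides", [])) + \
        [(virt, pkg["source_version"])]
    out["conflicts"] = list(pkg.get("conflicts", [])) + \
        [(virt, "!=", pkg["source_version"])]
    return out

def clash(p, q):
    """Does some Conflicts atom of p match a versioned Provides of q?"""
    return any(cn == pn and OPS[op](pv, cv)
               for (cn, op, cv) in p["conflicts"]
               for (pn, pv) in q["provides"])

# Two binaries from the same source but different source versions
# exclude each other after the transformation; two binaries from the
# same source version do not.
old  = add_source_sync({"name": "libfoo1",   "version": 7,
                        "source": "foo", "source_version": 7})
new  = add_source_sync({"name": "foo-utils", "version": 8,
                        "source": "foo", "source_version": 8})
same = add_source_sync({"name": "libfoo2",   "version": 8,
                        "source": "foo", "source_version": 8})

print(clash(old, new), clash(new, same))   # -> True False
```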
The first cheat: we do not really have versioned provides in Debian. However, when we do this analysis we are no longer working on the Debian format; we work on a different format called CUDF, which does allow versioned provides. The second cheat: this works when all the binary packages coming from a source package have the same version as each other, which is not always the case. A maintainer may have decided to have one binary package follow a completely different version scheme than the rest. In reality our analysis is a bit finer: it looks at clusters of binary packages which have similar version numbers. And I see there are several questions; David? Ansgar, maybe?

(David: once you introduce versioned provides, you can actually have packages going backwards in time, because you can introduce a new package that provides an old version number of another package. So your analysis changes once you allow versioned provides, because you can no longer guarantee that version numbering is monotonically increasing.) I do not really follow. (Suppose package foo is at version three...) Excuse me, of course I am assuming that src: is a prefix which is not yet used, so no one depends on anything like src:s. (No, I don't have a problem with this particular step; I'm just pointing out that if your model allows versioned provides in general, some of your earlier assumptions don't hold. If Debian were to suddenly start allowing versioned provides, you could introduce a new package that provides a lower version of an existing package.) Okay, good remark. In fact we are not starting from a model in which we allow versioned provides; we start from the Debian model, with only what Debian policy allows, and then we translate into CUDF, because our tools in fact use the CUDF format internally, and only in this transformed model do we introduce the versioned provides.

Okay, there are some other things one has to look at. For instance, the binary packages generated from a source package may have almost the same version number as each other but not quite: one of them may have an epoch, or one of them may have a binNMU suffix. But this can be dealt with easily. All right, so this, in fact, is the solution; this is what our tool is doing. Now let's look at what one gets when one runs this tool on a real repository. Yesterday I ran it on a recent sid repository, and it was quite boring: it found almost no outdated packages, which again I think is due to the freeze; all the packages are currently quite well behaved. For this reason I took an old analysis, from October of last year. In that case, in sid, main, on i386, we had some 35,000 binary packages, and the classical edos-debcheck tool found among them 431 which are not installable. Now we run our outdated analysis on this: we add the new future versions of packages as dummies, and after adding these dummies we have some 82,000 packages, which is still quite a low number. The tool runs for some two minutes on my laptop, and it finds among them 119 outdated packages in the sense I have just explained. Let's look at some of the results. Among these 119 we find some 60 which look like this, and of course, if you are a Debian developer, you have seen these things.
You know what this is: this was the Python transition. This is the kind of explanation we get out of the tool: you find that the package with this name and this version number is found to be outdated, and the reason is that it has a dependency which cannot be satisfied, a dependency on python in a version smaller than 2.7. Now, I know there is a Python transition going on, which means I would of course like to ignore all the reports that come from that fact. How do I do that? I could filter with grep or something like that, but the proper way is to add to my repository a dummy package which just provides python at version 2.6. When I run my tool again, I now get some 50 packages which are found outdated, and I look again at the results. I still get something like this, but no, this one depends on a really, really old python version, so this really is a bug. I think the bug was already reported; I do not remember, and if not I certainly did report one. This was the only package still found uninstallable due to a python dependency, but there are others.

We also have cases like this one: a package was not installable because the maintainer had hardwired into the dependencies a dependency on a package which had already advanced. Here we had a dependency on asterisk, and at that time we already had a version of asterisk newer than the one required; the dependency was hardwired into the control file of the package. This was a bug, and again a bug which was already known at the time. The next one looks exactly the same, but its nature is different. Here we again have a package with an unsatisfiable dependency, this time on binutils, and binutils is of course in the archive, but in a newer version than required. In this case, though, the dependency was filled in at build time, and that is a bit of a problem for me: I would like to file bug reports automatically out of this data, but for these cases, where the dependency is filled in at compile time, what I should do instead is ask the release team for a rebuild, a binNMU, of the package. Which I did in this case, but I had to find that out manually by looking at the package.

And this, finally, is an interesting example; let's look a bit at what happened here. For the first reading, please ignore the blue lines and just look at the black lines and the red lines. We have a package called cyrus-admin-2.2 which is found to be outdated, and the reason is that we have two dependency chains going out from this package towards two packages which are in conflict with each other. These two conflicting packages are cyrus-admin-2.2 itself, which is precisely our root package, and cyrus-admin-2.4, which is reached by following a dependency chain; and cyrus-admin-2.4 declares a conflict with cyrus-admin-2.2. The lower part of the diagram explains how we arrive at that package: there is a dependency chain going out from our root, cyrus-admin-2.2, leading to a dependency on cyrus-admin-2.4, and so we arrive at a package which conflicts with the root itself. So clearly this is not installable. However, according to what I just said, it is not obvious that the package is outdated.
Why? We have a conflict, but the conflict is expressed in the package cyrus-admin-2.4. So in theory, if cyrus-admin-2.4 advances to a newer version which drops this conflict, the problem would be solved. However, this is where the blue lines kick in: the blue lines tell us that both packages come from the same source. They both come from cyrus-imapd-2.4, which is a bit strange, but I think these are backward-compatibility packages or something like that. Both come from the same source version, and since both come from the same source package, they cannot advance independently of each other. In this case we know that the maintainer of the source package has to do something about it; he has to fix his dependencies in order to get it right. I think somebody had tried to set up transitional packages between versions and got mixed up with the version numbers. We filed a bug report about this and it was fixed immediately.

Okay, now to wrap up. First of all, I should say something about the name. This is a line of research which started a long time ago: in 2004 we started with a European research project called EDOS. The EDOS project has been over for more than five years now, but somehow the name has stuck with us and we still use it as a kind of brand name, because it is by now quite well-known. EDOS was the first project on the analysis of package relations, where we worked on different aspects of quality assurance of packages. Then we had the Mancoosi European project, with a slightly different focus; it is also over now. We are still working on these tools and still maintaining them, but we have now changed the names: we are calling them dose. So it is no longer EDOS, it is called dose now, with a new implementation of all our tools.
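The same-source reasoning behind the blue lines is easy to mechanize. Here is a minimal Python sketch, assuming stanzas already parsed into dictionaries; note that in a real Packages file the Source field is omitted when it equals the package name, and may carry an explicit source version in parentheses, which the sketch strips.

```python
# Sketch of the "blue lines" check: two binary packages can only advance
# independently when they are built from different source packages.

def source_of(stanza):
    """Source package name of a binary stanza.

    Falls back to the package name when Source is absent, and drops an
    explicit source version such as "cyrus-imapd-2.4 (2.4.12-1)"."""
    src = stanza.get("Source", stanza["Package"])
    return src.split(" ")[0]

def must_advance_together(a, b):
    return source_of(a) == source_of(b)

p1 = {"Package": "cyrus-admin-2.2", "Source": "cyrus-imapd-2.4"}
p2 = {"Package": "cyrus-admin-2.4", "Source": "cyrus-imapd-2.4"}
print(must_advance_together(p1, p2))  # prints True: same source package
```

When this check fires, the conflict cannot be fixed by one package advancing on its own, so the report can be routed straight to the maintainer of the common source package.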
The tool I just presented to you is called outdated, and you find it in the Debian package of the name dose-outdated. And just a brief note of advertisement: we also have a new version of debcheck, or distcheck. The old version had been untouched for quite some time; we now have a new implementation which is much better, which has a modular architecture, which has nice documentation, and which is multiarch-aware. For wheezy we are still keeping both the old EDOS version and the new dose version, but for wheezy+1 we would like to get rid of the old stuff and keep only the new dose tools. So if you are interested in this, if you are using it, please move to dose, look at it, and talk to me about what you think of it.

Now, what remains to be done? There are some things I would like to do; some of them are easy, some of them are not. First of all, it would be nice to have an automatic classification of the results. This would make it much easier for me to file bug reports, because at the moment I file bug reports when I feel like it and when I have the time, which has not been very often recently. For instance, we get quite some reports about packages which are cruft. Cruft means that they are still in the archive but no longer generated from the source package. Usually they should be cleaned out by the archive software, but sometimes this does not happen, so sometimes we have cruft which is found to be non-installable or even outdated. I should also find a good way to automatically find packages which just need to be recompiled.

A bit more difficult: I would also like to have some way to integrate the information about the officially ongoing transitions. I know there is a web page by the release team; I just have to find a way to parse this information and to integrate it into the tool which does the classification of the results. Okay, so this is easy stuff.
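As one example of such automatic classification, cruft detection can start from comparing which binaries exist against which binaries are still generated by some source package. A minimal sketch, assuming the Packages and Sources files are already parsed into the simple Python structures shown; the data is invented for illustration.

```python
# Sketch: a binary package is "cruft" when it is still present in the
# Packages file but no current source package lists it in its Binary
# field any more.

def find_cruft(binary_packages, sources):
    # Set of all binaries still generated by some source package.
    generated = {b for src in sources for b in src["Binary"]}
    return sorted(p for p in binary_packages if p not in generated)

binaries = ["foo", "foo-oldlib", "bar"]
sources = [{"Source": "foo", "Binary": ["foo"]},
           {"Source": "bar", "Binary": ["bar"]}]
print(find_cruft(binaries, sources))  # prints ['foo-oldlib']
```

Reports for packages flagged this way could then be filed against the archive rather than against the package maintainer.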
It is just something to implement. What would be more interesting, also from an academic point of view, is a more precise model behind the analysis. Currently we assume that every package can change in any arbitrary way when it advances to a new version, and this is of course not realistic. However, the challenge is to find a more realistic model, and this model cannot be something where, for all 15,000 source packages, I fill in by hand how each package may possibly evolve. What is needed here is some generic model which is realistic and which gives us a finer analysis of how packages may evolve.

Finally, I also have to work on the explanations, which are currently kind of a mess. This is due to the output we get from the SAT solver: there are a lot of different alternatives which have to be taken into account, because for every package there are many different versions, and for all possible combinations the SAT solver gives us an explanation. This gives me a list like this one, and currently it is something which I cannot put into a bug report, because otherwise you would kill me. So I have to write by hand an explanation which summarizes the reason why a package is found to be outdated. Okay, thanks for your attention. I think time is almost up, but maybe there is some time for questions.

Q: Fantastic stuff. Have you considered doing upgradeability analysis?

A: What do you mean by that?

Q: It seems like your models and much of your software might be able to tell whether a system with a particular package installed could have that package upgraded without needing to de-install it.
Q (continued): That is, without needing to remove it in the meantime. Removing packages in the meantime during an upgrade is undesirable and usually a bug, and usually it is the result of bugs in the dependencies. We had one in the woody-to-etch upgrades, I think, where according to the dependencies you actually had to remove X during the upgrade, but apt-get would supply some fallback options, so you didn't notice this.

A: No, we haven't looked at this so far. In fact, our analysis is somewhat coarse, in the sense that we are not really looking at the scheduling of an upgrade. We just look at the original situation, where these packages are installed, and at the target situation, where those packages are installed; we are not looking at the sequence of installations that have to be performed. I was not aware of this, so maybe we can discuss it offline; I would be very interested to learn more about it. Thank you.

Q: Have you looked at multiarch and the changes in the relationships that can happen?

A: Not in the context of this outdated analysis. The distcheck tool is now multiarch-aware, so it knows about dependencies which can be satisfied across architecture boundaries. The outdated analysis, no, I haven't looked at multiarch there, but I do not think, well, I do not expect that it would change things in an essential way. Do you think there is a particular problem coming from multiarch?

Q: Not that I know of.

A: Okay, good.

Moderator: Well, thanks, Ralf, if there are no more questions. Thanks for your presentation.

A: Okay, thank you.
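As a closing aside, the explanation-summarization problem mentioned near the end of the talk could be attacked by grouping the solver's per-version explanations by the dependency that fails, collapsing the combinatorial blow-up over versions into one line per root cause. This is a rough Python sketch with an invented tuple format standing in for the real solver output.

```python
# Sketch: collapse many per-version SAT-solver explanations into a
# human-sized summary, keyed by the unsatisfiable dependency.
from collections import defaultdict

def summarize(explanations):
    """explanations: list of (package, version, broken_dependency) tuples."""
    by_dep = defaultdict(set)
    for pkg, version, dep in explanations:
        by_dep[dep].add(pkg)  # all versions of pkg share one entry
    return {dep: sorted(pkgs) for dep, pkgs in by_dep.items()}

raw = [("a", "1.0", "python (<< 2.7)"),
       ("a", "1.1", "python (<< 2.7)"),
       ("b", "2.0", "python (<< 2.7)")]
print(summarize(raw))  # prints {'python (<< 2.7)': ['a', 'b']}
```

A summary like this is short enough to paste into a bug report, while the full per-version explanation list can be kept as an attachment for the curious.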