Okay, so let me start. Here's the agenda: I'll give a bit of a talk first and tell you what we do, then I'll give a live packaging demo to put some concrete examples behind what I'm going to say, and then I hope to get to a quick discussion about the various challenges we have with Haskell in Debian, and possible solutions. I have four or five bullet points for that, but we may not get to discuss them all. We have people watching us remotely from Germany who either got up early or stayed up late, I don't know which, so I'd be happy if someone could volunteer to relay questions from IRC, just to make sure whatever the remote viewers want to tell us actually reaches us; I'm on the talk-room channel. [Garbled logistics about the IRC relay and the microphone.]

So, the team's job is simple: we maintain packages. We maintain the Haskell compiler, GHC, and basically all the Haskell libraries that are in Debian. That's the main duty of the team. We also have to make sure that the team stays healthy and grows, and that's really the focus of both this talk and basically all of our problems. So let's see how we're doing. I tried to get some statistics. This was the first table I got about how large our team is, and I was surprised to find that we maintain the most packages in Debian.
That is, if you count all versions that have ever been in the archive, across all suites, and all instances of binary packages in all sections; basically the naïve result of a query against the universal Debian database. So I tried to fix this and looked only at unique source package names, and we got this: we now drop to third place. That's the result of the right query, and this is the table that reflects reality. So we have the third largest maintainer group by package count; there are two teams ahead of us [names unclear in the recording], and we're catching up. Still better than packages that are not team-maintained at all, I guess.

So how do we do that? Who does the work, basically? This is another table from the same source: uploads in the last six months. I'm not counting who signed the upload, but who appears in the changelog, so that non-uploading contributors also have a chance of appearing in the table; it reflects more closely who does the work. And you see: basically Clint and me, and then the others. We'll get back to this table when we talk about how healthy our team is. Most of the work is done by two people; on the other hand, we do have a number of people who have each done something, so it's not all concentrated, which is good. And we have some active newcomers. I think if we restricted the time frame to when one of them started contributing, he would have a much larger fraction of the uploads, and I'm very happy about that. So it seems that we have few people making a lot of packages; one could say we are efficient. I tried to measure that.
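A table like this can be derived mechanically from changelog data. Here is a minimal sketch in Python; the entry format and all names are invented for illustration, not the team's actual tooling:

```python
from collections import Counter

def upload_counts(changelog_entries):
    """Count uploads per contributor from (package, contributor) pairs.

    We count whoever appears in the changelog, not who signed the
    upload, so sponsored work is credited to the actual contributor.
    """
    return Counter(person for _pkg, person in changelog_entries)

# Invented example data standing in for six months of team changelogs.
entries = [
    ("haskell-glob", "Joachim"),
    ("haskell-dlist", "Joachim"),
    ("haskell-lens", "Clint"),
    ("haskell-glob", "Joachim"),
]
print(upload_counts(entries).most_common())  # → [('Joachim', 3), ('Clint', 1)]
```

The same counting rule explains why newcomers show up in the table even before they can upload themselves.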
In the same time frame, the last six months, I compared the number of uploads with the number of people doing them. By that measure we are the most productive team: we have 56 uploads per maintainer in this time frame, on average. Only a couple of teams come close [names unclear in the recording], which might be because they have more people; this is not a completely rigorous comparison.

For those who don't know it already: Haskell has a lot of special properties compared to other ecosystems. First of all, the packages are very homogeneous. There are very good conventions for how packages look, where you find things, how they are laid out, and the metadata is declarative. Let me show one of these packages. Let's use cabal, the upstream package manager, to fetch one. You see there are some text files, and the most important is this Cabal file, which looks a bit like a Debian control file, which makes me feel at home. It has things like license information, version, dependencies, and the list of modules. And then there's, of course, the source as well; we'll come back to that. This is good because it lets us spend very little time on the individual package; there are few surprises.

The other thing that helps a lot is a property of the language itself: it is statically typed. Most of the problems we would otherwise face, like incompatibilities between various packages and versions, are caught by the compiler. So I'm reasonably confident that a change to a package is okay if all the packages still compile. We do run test suites where packages have them, but the type system takes away much of the need for comprehensive testing after each rebuild.
The kinds of bugs caused by packaging mistakes are exactly those that the compiler catches and reports. So those are the upsides of packaging Haskell. There are downsides as well. The first one is this metadata I've just shown you; let's look at it again. You see it's very detailed: it doesn't just say what packages this one depends on, it gives a very precise range of versions that it is declared to work with. And by convention these bounds have to be maintained by hand: upstream authors relax them if they know an upgrade would not break things, but if they don't know, they assume it would break. So if I were to upgrade dlist to 0.8, I could not compile this package against it without changing the packaging, even if nothing actually breaks. That's annoying. But even more annoying is when they don't put a bound in: I upgrade to 0.8 and it breaks a third of the archive, with nothing in the metadata indicating that it would break. So the bounds err on the side of caution rather than correctness, and upgrades get slowed down for problems that often aren't real. That's something not commonly seen in other ecosystems.

And then the other thing: we have basically no ABI stability. What this means is that if I change a dependency, say upgrade dlist from 0.4 to 0.5, I have to recompile everything that uses it and update the tight inter-package dependencies. Because I can't rely on the ABI, I have to rebuild each package whose dependencies changed, and then everything above that, so one upgrade at the bottom can mean rebuilding 100 or 200 packages. Of course that requires tooling and automation; we have that now, to the extent that it's not a big deal anymore. It used to be much more difficult, back when the archive tooling didn't understand the installability implications of those tight dependencies and transitions involved a lot of manual work.
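To make this concrete, a Cabal file with such tight bounds looks roughly like this (a simplified, hand-written approximation in the spirit of Glob's metadata, not the exact upstream file):

```
name:          Glob
version:       0.7.5
license:       BSD3
build-type:    Simple

library
  exposed-modules: System.FilePath.Glob
  build-depends:
    base         >= 4   && < 5,
    dlist        >= 0.4 && < 0.8,
    transformers >= 0.2 && < 0.6
```

If dlist 0.8 enters the archive, this package stops being compilable against it, whether or not anything it uses actually changed.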
So in a sense I think we can say that the challenges Haskell poses for Debian made Debian improve its infrastructure: it wasn't up to the task at first, we improved it, and other parts of Debian benefit. That's one answer to anyone who asks why we bother putting Haskell into Debian at all, a question I can understand if someone were to ask it.

Okay. So let's have a look at what our Haskell work looks like. These are roughly the steps I go through when I have to package a new package. So what does this mean, updating the package plan? I'll explain the package plan in detail later, but let's have a quick look. It's basically a repository with a file listing packages. (The font size here is always... yeah, that's why I don't normally use this editor when I'm doing presentations.) Okay. Basically you see a list of packages with version numbers, and now I want to add Glob in version 0.7.5. It's not yet in Debian, so I will package it. I add it and sort the file. Then I have a little program that verifies that this plan, with the new package added, makes sense in terms of all the interdependencies between the packages. So if Glob were to depend on something that we don't have yet, we would get a complaint within a few seconds. It does print a warning here, but nothing we need to act on, so we're fine. And you can see that it tells me that I've added Glob to the plan. So basically, that's our tool.

So this was step one. Step two is to copy the template debian directory and adjust it. Okay, let's do that. [Audience: Can you explain this a bit?] I'll come back to that; I have a separate section for it. I just want to get a bit further with the demo first, and I should watch the time. So I'll create a directory for the package; directories here are named after the Debian source package name.
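The plan verification step can be pictured as a simple closure check over the plan file. A toy model in Python (the real test-packages script works on actual Cabal metadata; all names and data here are invented):

```python
def check_plan(plan, deps):
    """plan: {package: version}; deps: {package: set of build deps}.
    Report every package whose dependencies are missing from the plan."""
    problems = {}
    for pkg in plan:
        missing = deps.get(pkg, set()) - set(plan)
        if missing:
            problems[pkg] = missing
    return problems

plan = {"Glob": "0.7.5", "dlist": "0.4.1", "transformers": "0.3.0"}
deps = {"Glob": {"dlist", "transformers"}}
print(check_plan(plan, deps))  # → {} : the plan is closed, all deps present
```

Dropping dlist from the plan would make the check report Glob's missing dependency, which is the "complaint in a few seconds" from the demo.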
We have a template directory with a changelog, a compat file, a control file, a copyright file, a rules file, a source directory and a watch file, and I go through them one after the other. The version number is 0.7.5, so I put that into the changelog. I don't bother with ITPs; it's just too much work. And this is me; that's it for this file. The compat file we don't have to change.

The control file is where most of the work is: I need to adjust the dependencies. We have a template here for bindings that depend on C libraries; this is not one of those, so we don't need that, and I don't need this comment either, it's just a reminder. Oh, I should have changed this one first; I usually do it first so I can repeat the same edit in the editor, otherwise I have to type it twice. Now, the dependencies. For these I look at the Cabal file. By now I know by heart which packages are provided by the compiler itself, like base and containers and directory, so I only need to add dependencies for what GHC does not ship. Those two dependencies here are dlist and transformers, so I put them in. This is how Cabal package names are mapped to Debian package names, and I also have to carry over the version ranges; that's very important. So for dlist it's from 0.4 inclusively to 0.8 exclusively, written like this, and the same for transformers, from 0.2 to 0.6.

And then we have the description. We have a little feature here: the description is written once, in the source stanza, because we reuse it for the different binary packages. Everything from here down is identical across our packages and doesn't have to change. When we are lucky we can take the description from the Cabal file. It's a bit short, but I guess for this package it's all right, so I put it in as the short description.
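Carrying the version range over is mechanical: Cabal's inclusive lower bound maps to Debian's `>=`, and its strict upper bound `<` maps to Debian's `<<`, applied to the corresponding libghc-*-dev package. A hypothetical helper illustrating the mapping (the function and its simplified input format are my own, not the team's tooling):

```python
def cabal_range_to_debian(haskell_pkg, lower, upper):
    """Render a Cabal bound pair like '>= 0.4 && < 0.8' as versioned
    Debian dependencies on the corresponding -dev package."""
    deb = f"libghc-{haskell_pkg.lower()}-dev"
    return f"{deb} (>= {lower}), {deb} (<< {upper})"

print(cabal_range_to_debian("dlist", "0.4", "0.8"))
# → libghc-dlist-dev (>= 0.4), libghc-dlist-dev (<< 0.8)
```

The two generated entries correspond to the pair of versioned Build-Depends lines typed in the demo.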
The long description is almost usable as-is, but it has to start with "This package contains ...", because of the policy that the long description has to be usable independently of the short one. This is, I guess, a little bit of overhead; otherwise we couldn't package this much.

Now the copyright file. I find that we spend too much time on these for too little value; I think we could use some leniency in what we're expected to put here. For the licensing we just copy it over. In this case it's a bit more complicated, because they didn't put the copyright statement in the license file, so I have to take it from this other file they have; that's a bit unusual. Credits: there are two copyright holders, so I have to put both of them here. I don't know who benefits from me putting this information there. Seriously. But that's okay, I can live with it; we have to live with some things we don't find fully agreeable. And it's 2014 by now, so I update the year. Then I have to indent this, which works with a little editor macro. There are no tests in this package, so I won't bother with anything for them here. We also have to adjust the watch file. That's it. So let's put this into a darcs repository, and we can build.

To build it, I have some automation that helps me manage several packages at once: I mark this package for release, and then I can mass-build it. It's a bit much to say "mass build" for a single package, but it works. This will now take the data from the directory... oh yeah, Glob is a tricky one: you may remember that upstream named it with a capital G, so the watch file has to refer to a capital G as well. So I amend that and try the build again. While this is building, notice what we have, and have not, done here.
There's very little entropy in all these control files, and hardly anything hand-crafted at all; that's a big plus. I guess you could even generate much of it. What we did manually here was commit to the VCS and push a tag, and there are also scripts in our tool suite to do that a bit more efficiently.

So this is building; let's look in parallel at what happens when you want to upgrade a package. It's similar. Again we start by updating the package plan, and it's a bit more interesting this time. Here's our package plan; these are the upgrades we want anyway, and I want to upgrade math-functions to the latest upstream version, from Hackage, which is where we actually get our source. Notice a nice feature here: for each package, it shows which other packages in the plan use it at this version. We can see that there is a newer version, so let's try to upgrade math-functions to it. I run the test-packages script, and it will verify that we can upgrade safely to this version without breaking anything, at least as far as the metadata goes. And it tells us that we can, but that we are missing an additional dependency: there's a new package that this new version depends on, so we need to package that as well. I add it to the file and sort it. Now the plan is consistent again, and we will group this upload together with the other packages we're doing at the moment.

So let's see what we still have to do: update the package plan to the new version; done, that was the first step. Now we actually have to update the packaging. For math-functions I have a script for that... sorry, what I wanted to show first is something else, a very nice thing: reviewing the upstream changes.
What this means is that I can construct a URL that gives me the diff between two versions of a package, so I can do the package review without even downloading anything. The script has already figured out what my current version is and what the latest version is, so I just run this command (it's in my shell history) and review the changes. Sometimes they have changelogs; unfortunately very, very rarely. Changelogs seem to be uncommon in the Haskell world, which is not nice. But I can look: these look like tests, which is nice. And of course I'm mostly interested in the changes to the metadata, because they tell me how the dependencies have changed. For example, we see that there's a new one; I will have that in mind when I do the changes now.

So I run mass-upgrade, which basically just figures out the version number so I don't have to type it, and then I modify the control file. This comment is not up to date anymore; we don't need it. And we know the dependencies have changed... oh yes, this is the new dependency, of course; it would be strange for it not to show up. So we also want to depend on vector-th-unbox, in an arbitrary version: upstream gave no bounds, that's fine, let's hope they're right. And this is strangely formatted; let's fix that. The bounds on this other dependency change as well, and we also have a new dependency on deepseq, another one of these small packages, so we put that in here too. This should be enough for the upgrade. [Audience: You're missing a comma after the prof dependency.] Oh yeah, thanks; that would have given me an annoying error message later. Let's fix it in the template too so it doesn't happen next time; that's also more consistent. Then I amend the commit that my script has created for the upstream release, and tag this for release.
Now, if I say mass-build math-functions, it will tell me, after creating a source package, that it can't build it, because obviously it has a new dependency that we haven't packaged yet. (This check uses edos-debcheck or dose, I believe.) It will also detect that the other package I put on the command line is already built, so it tells me it doesn't need to build that one. I've already prepared the other package, vector-th-unbox, so I just mass-release it, which tags it for release, add it to the command line, and off it goes. This works well even for the tens and dozens of packages that I sometimes have to upgrade simultaneously. These will now be built in the right order, so that they can be built against each other. (This is using [name unclear], in case you're interested; just a few very small scripts that happen to be very useful.)

And later, because of the ABI instability, remember, I'll have to check whether any new versions have appeared in the archive and schedule rebuilds of whatever depends on them. Since two days ago we have a new tool for that, and it looks like this: a cron job generates a file like this, goes through all the packages, sees what needs to be done, and gives me a to-do report. Currently everything is crossed out because there's nothing to do right now; if there were something, it would appear there. And this is also for those who want to help build but don't know what: anyone can look at this page, and if there are things to do, you can simply pick them up. It's much more transparent than it used to be. You can also see what is failing and why: criterion currently fails to build from source because its metadata was not precise, and the uploads we're doing now will help towards fixing this.
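Getting the right build order for such a batch of interdependent packages is essentially a topological sort of the build-dependency graph. A minimal sketch with invented data (the real tooling additionally consults archive-level installability checkers, as mentioned above):

```python
def build_order(deps):
    """deps: {package: set of packages it build-depends on}.
    Return an order in which every package comes after its deps.
    Cycle detection is omitted in this sketch."""
    order, seen = [], set()

    def visit(pkg):
        if pkg in seen:
            return
        seen.add(pkg)
        for dep in sorted(deps.get(pkg, ())):
            visit(dep)
        order.append(pkg)

    for pkg in sorted(deps):
        visit(pkg)
    return order

deps = {
    "math-functions": {"vector-th-unbox", "deepseq"},
    "vector-th-unbox": set(),
    "deepseq": set(),
}
print(build_order(deps))
# → ['deepseq', 'vector-th-unbox', 'math-functions']
```

The ABI-instability rebuilds work on the same graph, just walked in the other direction: everything above a changed package gets scheduled.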
I actually fixed that a few days ago; otherwise the uploads wouldn't have been ready for this demonstration. So this is step six: let's see how our uploads are going. Okay, we are already building math-functions... and it's built now, and I think that should be all of them. The to-do report keeps updating because I built new packages, and we are doing okay.

So let's talk about this package plan, then. I think this is one of the most unique things we do compared to other teams in Debian, so it might be interesting for other maintainers as well. We've seen that it contains basically a list of packages, and there's a bit more: sometimes there are attributes, such as binary, for example. This means the package is not used as a library, so the tooling needs a different way to figure out its version, and it also has to figure out the package name differently. [Question relayed from IRC: Why did you put deepseq in the Build-Depends when it's provided by GHC?] I thought it wasn't. Maybe that has changed at some point... okay, sorry, my bad: it used to be a separate package, but no longer. It doesn't hurt in this case, because it's an unversioned dependency, so it's simply satisfied. If it were a versioned dependency on something GHC doesn't ship, it would make the build-dependencies unsatisfiable, we would notice, and we would fix it. So it doesn't really matter here. But well spotted.

Okay, successfully built, and what I do now is mass-upload: upload the changes, push the tags, push the changes, make sure everything is tagged. People sometimes forget to push their tags, and that confuses other developers. ...Yeah. Okay, I forgot to actually push the whole thing first, before the build. Well, as I said, it's pretty close; usually this would just work.
I wanted to give you an impression of how I'm managing these packages. It doesn't have to be the best way, but maybe it explains how we can manage 500 packages with this few people. I guess Clint might be using some other tools, I don't know; we can talk about that later, too.

Okay, so back to the package plan. We have this metadata now, and there are more attributes. For example, notest means that we don't want to run the test suite, maybe because its dependencies are not something we want to pull in, or something else. We can add flags, which are like configure flags, and the tooling will apply them correctly. And then there are things like an ignore attribute that lets us temporarily ignore a package. What happens then is that the checker reads the actual packages from Debian and the packages from Cabal, from Hackage, compares them against the list, and reports anything that's out of sync, for example where Debian has a version newer than the package plan, which is not allowed by our policy. So it makes sure everything stays consistent. And this check is done by the actual cabal-install program that everybody else uses: we construct a command line that looks like this, which basically installs all the packages with the right flags in the right versions, and sometimes disables tests and so on. It's quite a long command line; I should add, it's a 63-kilobyte command line, but it works, it works fine. The problem is that you can't debug it by looking at /proc/*/cmdline, because that's limited to one kilobyte, so I had to provide another way to debug it. A tiny detail.

This setup also allows us to patch things: we keep a copy of all our patches that affect the metadata in this tree, and you can see it's a quilt-compatible format, so you can use quilt to work on the patches.
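The long consistency-check command line can be pictured like this: every package in the plan gets pinned to its planned version, and the real cabal-install solver is asked whether the whole set is mutually consistent. A toy sketch with invented plan data (the real command also carries flags and test settings, which is how it reaches 63 kilobytes):

```python
def cabal_check_command(plan):
    """Build a 'cabal install --dry-run' invocation that pins every
    package in the plan to its version; if the solver accepts it,
    the plan is consistent."""
    parts = ["cabal", "install", "--dry-run"]
    for pkg, ver in sorted(plan.items()):
        parts.append(f"--constraint={pkg}=={ver}")
    return " ".join(parts)

print(cabal_check_command({"Glob": "0.7.5", "dlist": "0.4.1"}))
# → cabal install --dry-run --constraint=Glob==0.7.5 --constraint=dlist==0.4.1
```

Delegating the check to cabal-install means the plan is validated by exactly the solver that end users run, rather than by a reimplementation.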
So yes, there are patches in there, and with quilt the packages end up in a nice state. I can also add Cabal files there that are not from the upstream tarball. Lens, for example, is a package whose metadata has been modified on Hackage, so that the Cabal file Hackage reports differs from the Cabal file in the tarball. They do it so that maintainers can update the metadata without a new release; it's a new feature, it's great for them, just annoying for us, because whenever it happens we have to patch it in the package plan.

Yeah, we've seen the tests; they are also run by Jenkins. We have a set of Jenkins jobs, one of them for the package plan, and we can see that it's currently succeeding, yay; if not, it sends mail, so the invariant stays preserved. Let me go quickly through a few more tools. We have the Package Entropy Tracker, PET; it's something the pkg-perl team had first, I think. It gives an overview of the status of the packages in the VCS, compared to upstream, and of possible changes it needs. I think it has been running unmodified for years, and I'm lucky it's running at all; somebody should take care of it. I also have the impression that the PET team itself is kind of dormant, which is a pity, because it's a useful tool. So if somebody feels like working on this, that would be great.

Then, I've already talked about build scheduling, and I've just shown you the to-do job. There are further jobs, for example one that tries to install all our packages in unstable. It has been failing for three months for varying reasons: a package that had to be removed, a transition that slipped back, sometimes something we didn't fix in time. That's a bit unfortunate, but at least currently the only remaining failure is the criterion error with the imprecise metadata; aside from that, we could have fixed things. The
other jobs, like upgrade checks between unstable and testing, are not really used yet; they're experiments. Some of our packages have autopkgtests. And you've already seen that the package plan knows about all the versions. There are scripts that I currently run by hand that really should go into a cron job somewhere, and somebody will have to figure out how to do that; a typical problem, I guess. We could at least document it somewhere, document how to find it all.

All right, that took me a little longer than expected, but I hope it was a good inside view, and with that I'm opening the discussion. These are some of the challenges that I think we could discuss. [Audience: Is the question clear? ... Before that: there's also a script for the transitions, where you can see the uninstallable packages.] Oh yeah, sorry, you're right; you can see the uninstallable packages there, it's complete, and that one was the very first such tool: you just take the metadata from the archive and you get the overview. Thanks a lot for that, I had just forgotten about it. I am using it. And yes, please do tell us how it can be improved; it's not the perfect thing for us yet, but it's a good one, and I'll make sure I use it more. Thanks a lot for that.

Okay, so the first challenge is basically... [Audience: Just a question about library versions first. You seem to ship pretty much the latest version of everything; do you find that works?]
[Audience continues:] Do enough people who write Haskell programs make sure their programs work with the latest versions? Because in Java we ended up packaging something like 16 versions of the same library. [Speaker:] Right. We have this policy of having one version of each package. It works okay-ish; we'll see in a moment where it doesn't work very well. We sometimes can't upgrade to the latest version of something because something else still requires the older version. But there's a movement upstream towards always being compatible with the latest versions; there are tools that check this, continuous-integration tools that people can opt into, and we can maybe push that upstream and encourage authors to use these tools. We rarely have the situation where we actually need to package two versions of a library. There was one case recently, and I don't know if it was actually required or just pre-emptive problem-avoidance. But usually it works.

Okay, so the bottom point, which we've touched on already: what should we package at all? And, related to that, who are our users? I see different distributions with very different sets of packages. Fedora has fewer, but has some that we don't have, and the other way around. [Another distribution] packages a lot more than we do, but there are also packages it doesn't have that we do. So there's actually a lot of arbitrariness there. What should we package? Obviously we need the libraries that are dependencies of programs in Debian: if Joey decides that git-annex needs some library, that's a very compelling reason to package it; that's maybe the only hard reason I can think of. Then maybe we want to package very common libraries that many people want to use; that seems reasonable, but it's a matter of opinion. Maybe we should also provide packages that are hard to install as a user because they depend on C libraries and such, which cabal does not install by itself, so it's confusing to get there on your own.
Or maybe we should just continue packaging what we like to package and what we like to use; arbitrariness is not necessarily a problem. And maybe there are other reasons. Related to that is the question of who our users are, if there are any: who would use Haskell packages from Debian rather than installing them with cabal-install, and are we doing the best we can for them? Or are we doing good work for a hypothetical user that doesn't exist outside our own group? So yeah, that's the question.

[Question relayed from IRC: What is the relationship with Stackage, and how does that fit into the packaging here?] Right. Stackage is a process where package authors get notified of breakage, and the people who sign up for Stackage are encouraged to keep their packages working against it. One option would be to say that we only package what is in Stackage, but I think, strictly speaking, we would then be restricting ourselves to other people's decisions. We should definitely encourage Haskell developers to use Stackage: it takes certain problems away and detects problems before they reach us; they would be the first to notice. I think that answers the question. The related question is whether there are things Stackage does that could help us directly, and how we can make it help. Basically, Stackage being successful helps us; I don't think Stackage needs to do anything differently, but it should be more prominent and more visible in the community. I can't think of anything right now that I want Stackage to do better. To really answer that question I would like download statistics to be published, so you could see whether packages are downloaded more here or there. Good idea; I've thought about that for a while. We can see total downloads over the last few days, so I guess we could do something with that. [Garbled.] Now, Joey is
saying [unclear], but I'd like to stay on this point: who are our users, who are we doing this work for, besides the dependencies? I think that's part of the question. In general, I'm wary of packaging things in Debian that I don't use myself. Maybe it's your favorite library, or dependencies, things we have because I need them for something I use myself; but otherwise I find that I'm generally not the best maintainer for these packages. So we don't have a very clear answer; this goes into the next slide, about upgrades, I guess.

[Audience:] Let me put a really radical point of view on the table: what if we only packaged dependencies, everything needed to get the applications in, plus things that are educational, things required for people who are getting started with Haskell and don't want to worry about the toolchain? Gloss, for example, would be an example: it's a graphics library meant for teaching and nothing else. And then have everybody else use cabal. Would that be an improvement, or would it be giving something up? [Speaker:] Yes, that's a good point; let's think about it.

[Another audience member:] When I started out writing my programs, I didn't really know much about Haskell either, and I had this problem: I'm targeting Debian, I'm targeting something that I want to be in distributions, but I kept finding that all these libraries I use aren't packaged in Debian. So maybe that should count for something. [Speaker:] That's true. I guess we believe we are, though I'm not sure we really are, curating a set of packages that is supposed to be used, so that it's not a totally arbitrary selection, and people can hopefully rely on packages in Debian being a bit more stable, a bit more usable, than packages not in Debian. I don't think we've actually thought it through in those terms at the moment.

[Audience:] I think part of it, too, is that you're fighting the history of Debian Haskell packaging, which was more or less unusable a
few years ago, and I think everybody got into the habit of doing everything outside the distribution. But I really am a big fan of the packaging and of the set of things you can bootstrap from it, for a couple of reasons. Obviously you are putting a lot of effort into all this, even with all your automation tools and everything else, and I would love to see your time going into something other than stuff that we don't really care about. But also, because I picked up that set of habits a few years ago, fewer and fewer of the packages that I thought would be installed actually are, and that has its own set of problems. I certainly get that there's a complex set of questions here; it doesn't seem like something that's super easy to answer, but I would say less is maybe more.

Thank you very much; I think this is a lot of work and I really appreciate it. Personally, I don't want to install anything outside of Debian, so I would like to see everything packaged, and I would hope that other people would do that too. I have to go, but I wanted to suggest that somebody take a look at it, because it fails to build on an alternative platform, so I would love to re-ask about that.

Sorry, I guess that point is still open, so let me just ask a question: how does it look with stable? Basically, we're going to take a cut at some point. Is that good for the Haskell community or not? Maybe that's an argument for just packaging the core set. Or do new Haskell developers just need to track testing? I think Haskell on stable is not very useful.

Maybe, again... we have only one minute left, so I really would like the video team to stay, because people are still watching, but I guess that's too bad, we'll cut it off here. Five minutes? Great, five minutes, let's see how much we can do. Okay, yeah, very quickly: we had this argument about Java
earlier in the week, about whether we should just package the compiler and the runtime and that's about it, as far as users are concerned. Our feeling then was that maybe this wasn't the best for our users, to give them just the bare minimum they need to bootstrap themselves; our feeling was that this would sort of sell the distribution down the river.

But we're doing okay. If Clint stopped working on it, or if I stopped working on it, we would probably do less well. Maybe if I stopped working on it and we switched to git, people would... I don't know if that makes a difference. Maybe we can also discuss in the hallway whether we actually need to actively get more people to contribute, and how we can make it easier. This is related: I agree that git has won; it's just that I'm used to this setup. I won't get an extension this time, so I'll just get out what I want to say. These are minor things, I guess. If somebody would stand up and do all the conversion to git and update my scripts, I wouldn't stand in the way; but as long as nobody is doing the work, I don't care enough, and I'll keep using what we have, at least for a while. It's not fundamental opposition; it's just that somebody has to do it.

We could do better on architectures, so maybe that improves, and Haskell becomes more visible and more important, with more products written in Haskell shipping in Debian; maybe then porters will get more interested in helping us there. I'm not doing as well there as I want to. Colin, please, but I guess that's our last minute.

Thank you. As one of the few porters who has bothered: it's a very, very high barrier to entry for porters. I'm not saying they shouldn't try, but we probably shouldn't expect people to turn up and want to fix GHC unless they really enjoy strenuous intellectual puzzles. I'm all for that, but let's not set expectations too high there, because it is very, very difficult.

Right, so somebody has to do it. Okay, I think we have to thank the video team for giving us the extension, and I think I'll finish the
official part here, and we can just keep sitting and talking. Thank you, goodbye, Sven.