Well, again, hello to everybody who's come to this track. I'm Matthew Johnson, and Niels here has been one of the people who's been trying to push through various changes to our sub-policy document. It has unfortunately stagnated for quite a long time, so until recently, if you went and had a look at the Java policy, it didn't bear very much relation to what the current practices were. But we've been working to bring the two together. So I'm going to go over a quick summary of what changes we've made and what the current state of policy for packaging Java libraries and applications is, and then I want to have a discussion about where we should go from here, because there are a number of issues I think we should answer in policy to improve how Java packaging works in Debian. So what changes have we made so far? We are now down to only three VMs in Debian: we have the latest Sun JDK in non-free, the latest OpenJDK in main, and we also have GCJ. What are the benefits of the Sun non-free JDK? Is it historical? Well, OpenJDK is very nearly the same as the Sun JDK. Unfortunately, when they came to release it, there were a number of parts of the system they didn't actually have the rights to release under an open source licence. Those parts they've been trying to reimplement, so there are a small number of things which are missing, and a small number of things that don't work quite as well. For example, font management is not working very well with OpenJDK, and the Swing interface is slower also; I've got some bug reports on this. So unfortunately, for the moment we do need to keep the Sun JDK around in non-free, but with luck, as time goes on, OpenJDK will approach parity. Sorry, just a quick comment on that.
I would hope we aren't too quick to remove the Sun JDK from non-free, just because there are a lot of business users running it, and they're not going to be quick to migrate to OpenJDK until they have a level of trust. Well, I don't know how many of those are using the version packaged in non-free. I think it's a big convenience; I know folks who do it. And as I mentioned in the previous session, there are some problems with OpenJDK on some of our architectures, so unfortunately it looks like we're going to keep those. But we have managed to get rid of all of the other JVMs we used to have, so there's a much smaller range of things we need to make work. We've also removed the java-virtual-machine virtual package, and whilst it used to be the case that a lot of packages would ship -gcj packages containing the Java code compiled to native code, that's now deprecated unless you have a very, very good reason for it, which needs to be discussed in advance. Meta packages: we now have the default set of packages, default-jdk, default-jre and default-jdk-doc, and on most platforms these point to OpenJDK, except on the three which don't support it, where they point to GCJ. And for the large part you will just use OpenJDK, because your package, unless it does something particularly strange, will work with any of the options. What happened with the headless variants? Sorry, there are also headless variants of these, so if you have a package which doesn't use a GUI you can depend on those, and with any luck you'll have a slightly smaller dependency tree. We've also removed java-package, because now that we have OpenJDK in main and the latest Sun JDK in non-free, that should be sufficient, and we expect one of those to work. To repeat one question on this: now that we don't have java-package any more to build custom debs, what about security updates for the non-free Sun Java over the course of a stable release? I don't know who the maintainer for that is.
Yes, we are currently doing stable updates for any security updates needed, so we currently have 6.20, and 6.21 didn't have any security issues fixed. We've also removed default-jdk-builddep, because it's a very confusing name; it was only there if you wanted to build a GCJ native package, so we now have gcj-native-helper instead. Other various changes that we've made: we now want all libraries to provide javadoc, and this javadoc should use -link and then recommend the javadoc of all of the libraries they use, so that you get links between all the javadoc on the system. And if you install with recommends, which is the default, those links will work. Lastly, one of the very first parts of javahelper was jarwrapper. In a lot of cases you have a jar which you can run with java -jar, and it will run your application. If that's the case, then you don't need to ship a wrapper script any more; you can just set the jar executable. Then, if you depend on jarwrapper, it uses binfmt_misc and will just execute your application directly. Did you get in touch with the lintian maintainer to handle such a check? I believe he's in the audience. Because it would be nice to have this check in lintian. I think I filed a bug relating to that; there's an open bug. Basically, the only work that needs to be done is that somebody needs to figure out the right way of recognizing executable jar files and patch that into the code which complains about unrecognized executables, so that it suppresses the warning for them. I just haven't had a chance to go look at the magic and figure out what the right thing to do there was. So if somebody wants to come up with a patch, it shouldn't be that hard, and I'd be happy to apply it. We probably also need a check that, if you do have one of these, you are depending on jarwrapper. But I think I mentioned that as part of that bug.
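A minimal sketch of the executable-jar setup described above. The class name and paths are invented for illustration: the jar's manifest names a Main-Class, the installed jar is marked executable, and the package depends on jarwrapper so the binfmt_misc handler can launch it directly.

```shell
# Sketch only: org.example.Main is a made-up class name.  jarwrapper
# registers a binfmt_misc handler, so a jar with the executable bit set
# and a Main-Class in its manifest runs like any other program.
mkdir -p demo/META-INF
printf 'Main-Class: org.example.Main\n' > demo/META-INF/MANIFEST.MF

# In debian/rules you would then mark the installed jar executable, e.g.
#   chmod +x debian/mypackage/usr/bin/myprogram.jar
# and debian/control would carry:  Depends: jarwrapper, ...
cat demo/META-INF/MANIFEST.MF
```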
So now I just want to go over what the current state of policy is. If you are somebody who's trying to package Java software in Debian, this should tell you what you need to do. The policy, by the way, is now up to date on the website with the current state, and it's in, I believe, the java-common package as well. So, general points, independent of what sort of Java package it is: you must build with a specific JDK, so the setting of the alternatives on your system should not affect how the package is built. Is there a question? I'm building a package that has a custom JNI library; where should I be installing that? I'll get on to that on the next slide. This is just an extension of normal Debian policy: everything that you build and install as part of your package must be built from source during the build process, all of the jars, class files and javadocs. Even where your upstream ships these with the package, you really do have to rebuild them. Java programs: the goal here is that if a user comes along and wants to install a program, they don't need to know that it's implemented in Java. apt-get install, program name, should work. So the package shouldn't be a -java package if it's a program. And it needs to have either an executable jar or a wrapper, and now anywhere in the accepted paths is fine. And it must make sure that it deals with any environment variables or arguments to the JVM; users shouldn't need to know, with an application, what a class path is or any of this. So it's up to the wrapper script, or the manifest options in the case of an executable jar, to get this right. We have a location for any jars which are installed, which depends on whether you expect them to be used from another package or not. There are some programs where part of what they install can be used by other programs; in that case, /usr/share/java, but otherwise they should just live in the package-specific share directory. I don't think it needs to be in there.
I think if it's only a jar which is used as part of the program, and it's not used by anybody else, then /usr/share/<package> is the right location for it. Do we have the mic for that? I think /usr/share/java would be useful for libraries that are implemented multiple times. Like, some stuff is implemented by the Apache Software Foundation or by Sun; they're implementing the same API in different ways, so it would make sense to have that. Apparently you're comfortable taking questions as you go. Is there any guidance for writing these wrappers, in particular when you're supporting multiple JVMs and JVM versions? For example, one of the things we tried to do at Sun is reduce the need for command-line tuning of the VM, but in practice some command-line tuning is often fairly helpful. Now you end up with a non-trivial wrapper, and I don't know if you've looked at the way Gentoo handles that, or if there's any other guidance for how to do it. I believe there are two things currently packaged for helping you with wrapper scripts. One of which, as I mentioned, is jarwrapper: if you have a static set of options that you need to pass, you can actually put these in the jar manifest file, and jarwrapper will unpack them and use them as arguments to the JVM. There's also, I believe, another package for generating wrapper scripts. java-wrappers? java-wrappers, yes. That one is for running; jarwrapper is more for building. I don't know whether that handles complex cases; I'm using it for my software and it's working pretty fine. You need to manage such issues. Okay. Great. What about... I know you have a slide on the Java Native Interface, but with respect to executing, do you allow LD_LIBRARY_PATH to be set? Ideally, all of the JVMs that we ship will know to look in the /usr/lib/jni directory and have that on their library path already. I don't know if that's the case, but that's something I think we should look to do.
It should all be there implicitly. If you do need an LD_LIBRARY_PATH to run, then your script must set it. We don't forbid the wrapper from setting it. No, they may well need to, depending on how they work. Lastly, a program must depend on a JVM. The first option, as with all alternates, has to be a real package. If you can run with any of the VMs that we package, then default-jre is the correct one of those to use, as it will install the platform default for your architecture. Then you must have an alternate that depends on javaX-runtime for some version X, which is the lowest version of the runtime that you'll run with. Mainly, that will be determined by what class file version you build with, but if there are specific things in newer releases of the runtime that you need, then you need to set a higher version for that. Libraries: currently, the policy says that they should be lib<name>-java, although renaming that is entirely possible if we think that's a suitable way to go. Currently, in /usr/share/java, you'll find that actual jars are installed as jarname-version.jar, and then there's a symlink from jarname.jar to the versioned jar. Libraries should build javadoc, and it should link to, and the package recommend, the javadoc of any dependencies that you have. Finally, if you do have any JNI, this should be in a separate binary -jni package. What I haven't actually got up here, but is in policy, is that all of your JNI libraries live in /usr/lib/jni, named as the package's .so. No version numbers? With version numbers, if appropriate. Your javadoc should be in a separate -doc package, because javadoc typically will be larger than the rest of your package, and it shouldn't be installed for users who just need to run the library. That's quite a requirement. I mean, I have a JNI library that has three functions in it, and I have to have two packages? Well, you have to have two.
You certainly have to have an architecture-specific package, because your JNI will be different on each architecture. If you have a sufficiently small amount of Java in your library, and the size here is actually more dependent on the Java side than the JNI side, it might be acceptable to have a single package which is architecture-specific. But if you have a lot of architecture-independent Java that you're putting in this package, then that will increase the size on all of the mirrors, because all of them have to have it. So the recommendation is always to split the packages up. There were some questions earlier about sonames and things like that. So here we are still using the library name without a number in the name; do we want to change that? At the end of the talk I'm going to go on to what sort of things we want to change; this is just what the current status of policy is. And in fact, here we go. So that's essentially where we are at the moment. There are a number of things which certainly I would like to change, and things which other people would. So, transitions. This is one of the areas where Java does not have a good story at the moment: handling library transitions, either of JNI or just of Java libraries. And a lot of this is not helped by upstream practices, where they will assume that they can just change what they like, and when people use their libraries they will tend to just take a snapshot of it, bundle it in their jar, and forget about it, so they don't have any problems with upgrades. So the first problem here is that at the moment, in a lot of cases, you need to know about all of your recursive dependencies. If I have a library that uses another Java library, anybody using me needs to know about that, because they need to set it on their class path.
And that just means that whenever a library makes a transition that changes one of its dependencies, particularly adding one, all of the reverse dependencies of that library are going to have to change. They're going to have to make source changes; you can't just rebuild them. Actually, it turns out that the JVM has a solution for this already: if your jar contains a Class-Path in its manifest, then the JVM will, when it loads your jar, load everything in your manifest Class-Path as well. But very few upstreams actually use this. A possible alternative solution, which I think we're going to hear about later, is the Jigsaw stuff, which may give us a better and more robust solution for this. But I definitely think we need something: at the moment we're in the state C was in a long time ago, where all of your dependencies had to be worked out manually, and I don't think that's a tenable solution going forwards. So, the second point up here: at the moment we have these jars with version numbers in them, and symlinks which don't have version numbers in them, but it's unclear what this is actually useful for, particularly since they tend to be in the same package. So there's no way that you can have two jars with different version numbers installed from the relevant two different packages, because at least the unversioned symlink is going to clash. But can we actually change this to make them useful? The analogue with C is the soname of a library: you'll have an actual file which contains the soname, and then you'll have a symlink which doesn't, so that you build against the symlink, but then at runtime you always run against the version that you built against, and if you have multiple ones installed during a transition period, you can still do this. The problem, as I mentioned, is that we don't have header files; we don't have any of the things that are normally shipped in a -dev package, so it might just contain a single symlink, which is possibly a little wasteful.
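A sketch of the layout and the manifest Class-Path mechanism just described. The library name `foo` and version 1.2 are invented: the versioned jar is the real file, the unversioned name is a symlink, and a dependent jar's manifest can name its dependencies so the JVM loads them without every caller building a class path by hand.

```shell
# Illustrative layout only; 'foo' and version 1.2 are made up.
mkdir -p root/usr/share/java
touch root/usr/share/java/foo-1.2.jar
ln -s foo-1.2.jar root/usr/share/java/foo.jar

# A jar depending on foo could carry a line like this in its
# META-INF/MANIFEST.MF, and the JVM would pull foo.jar in automatically
# when that jar is loaded (entry form shown schematically):
printf 'Class-Path: /usr/share/java/foo.jar\n' > MANIFEST.fragment

readlink root/usr/share/java/foo.jar   # -> foo-1.2.jar
```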
And if we were doing something like that, should we also mandate that when you depend on a Java library, you depend on at least the version you built against? This means that as long as your upstream is only making backwards-compatible changes, the dependency ensures that the version you built against, which you checked worked, is the one that will be installed, or a more recent one, which also works. So I think I have some other slides on this, but I don't know if anybody has any opinions on any of those issues. Do you have any way of avoiding that? Yeah. And as was mentioned earlier, with having source jars available, perhaps that's what's in a -dev package: you get a symlink and you get source jars. Perhaps at least the documentation; I don't know about the source jars. Well, at the moment we have -doc packages for that. Maybe that should be combined, so that you always install the documentation; but then the -dev package will need to be installed when the package is being auto-built, and you don't want the documentation there. It seems like the only files you should install in a -dev package are those required for compilation, and that may be just the symlink. I don't know whether we want to make a practice of having hundreds of packages which only contain a single file. I mean, the alternative is having something in the regular package that conditionally installs the symlink if you're the newest one, so maybe a helper, like the ld.so configuration stuff, that finds the latest version and creates the appropriate symlink. But then you can't force an older version of the library to be your development version. I would rather have a -dev package, because then you can say: I want to develop against this version of the library.
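The "depend on at least the version you built against" idea might look like this as a generated debian/control dependency. `libfoo-java` and the version are invented for illustration; in practice a packaging helper would substitute the build-time version automatically:

```
Depends: libfoo-java (>= 1.2)
```

With only backwards-compatible upstream changes, this guarantees the installed library is at least as new as the one the package was tested against.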
That's how you build a jar that would build against an older version and also run against newer versions, which you might want to do. I don't know if that addresses the backwards compatibility; I thought about that too. I don't know, it's a tricky thing. I suppose, would it be possible to do what Keith was saying in the postinst, to just update the symlink to the latest version? Would that work? I don't think we want that. I think we almost explicitly want to build against the oldest version in the system most of the time, because if you're trying to build a package that will run against multiple versions of a library, and you assume forward compatibility, you compile against the oldest version of the library. If we're assuming forward compatibility, then I don't think you necessarily want parallel installs of things which are forward compatible. I think it's only at the point where people... No, you want to compile against the oldest version so that your package will then run against the oldest version. Normally this sort of trick is used with other languages where you're explicitly doing a transition to upgrade to a newer version, and so what you want is for everything, gradually, as it gets rebuilt, to use the newer version of the library. And if people are doing, essentially, backports, then they do that themselves on an older version. My understanding is that with Java, if you build against the new version of the jar, you're generally not going to break compatibility with the old version of the jar unless you're explicitly using the new APIs. In other words, it doesn't matter as much with Java which version of the jar is installed; the point is to build against the older version. With Java, essentially, the ABI is almost the same as the API.
So as long as they're just adding things, then you can use a newer version of the library, because it just adds things and doesn't change anything. But if you use a particular version, there's no guarantee that the thing you're using in that version wasn't added since a previous version, so you can't necessarily use the previous one. It's actually the same problem that C libraries have, which symbols files and so on solve. So if there were something equivalent in Java, then you could build the dependencies automatically: it figures out, from the fact that you're using that API, that you need to depend on that version of the library, and then it becomes fairly straightforward. There has been a suggestion about producing symbols files in a similar fashion. I'm interested in the fact that you've actually done this. A worry, when we were talking about this last night, is that they might become very large, because typically Java programs aren't very careful about keeping things non-public if they're actually private API, and so your symbols files might contain a whole bunch of things which are actually private API. One more thing about the Java world: you're assuming that the upstream developer knows what is an API and what is an ABI, and usually that's not the case. I don't worry about the ABI, because usually, in most cases, when you are building a huge piece of software with Java libraries, you are including them directly and you are basically freezing the version. So you don't really care about the upgrade of the ABI in the next release. So the Java world is a bit different, from the software development perspective, from the C one. The difficulty is that at the point at which you're really completely breaking the API, it's not clear that an unversioned symlink is going to help you a whole lot one way or the other.
The idea of the unversioned symlink is that you can build against whichever the latest version is, but if the API is changing completely, at that point the unversioned symlink is not actually helping the world. I think it might be beneficial to move to the model where backwards-incompatible API changes are built into the name, like sonames, and actually drop having the package version in the file names; I don't see much use for it. In general, I don't see a whole lot of difference between setting up the symlink in the postinst and just including it in the package. If it needs to be handled separately, then something more complicated has got to be done with it. The question is whether you can have... at the moment we have cases where you have two versions of a package in the archive, but those can't both install the same unversioned symlink. Oh, so do it in the postinst, so you don't always install it: if it already exists, then you don't install it. Well, possibly, if it already exists, you replace it. Right, okay, so they don't conflict with each other either. The thing is, though, that those semantics, wherein you create the symlink if it doesn't already exist, and if it does exist, you change the symlink if and only if your version is later than the other version, are pretty much exactly the semantics that alternatives gives you. I don't know how much people would object to the alternatives system being full of Java files. That might be a really interesting mailing list discussion to have, crossed between the Debian Java list and the Debian Policy list, because the dpkg maintainers all follow the Debian Policy list and are likely to weigh in on that kind of discussion. Just to throw in something that I know some people don't like: what about OSGi bundles? Sorry? Moving everything to OSGi bundles.
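The postinst semantics discussed above, create the symlink if absent and replace it only when our version is newer, could be sketched like this. This is a hypothetical helper, not an existing tool; names and versions are invented, and version comparison here uses `sort -V` rather than dpkg's comparison for the sake of a self-contained example.

```shell
# Hypothetical postinst helper: point the unversioned symlink at our jar
# unless a newer version already owns it.
maybe_update_link() {
  dir=$1 name=$2 ver=$3
  # extract the version the current symlink points at, if any
  current=$(readlink "$dir/$name.jar" 2>/dev/null | sed -n "s/^$name-\(.*\)\.jar$/\1/p")
  # keep the existing link if its version sorts later than ours
  if [ -n "$current" ] && [ "$current" != "$ver" ] &&
     [ "$(printf '%s\n%s\n' "$ver" "$current" | sort -V | tail -n1)" = "$current" ]; then
    return 0
  fi
  ln -sf "$name-$ver.jar" "$dir/$name.jar"
}

mkdir -p r/usr/share/java
touch r/usr/share/java/foo-1.0.jar r/usr/share/java/foo-2.0.jar
maybe_update_link r/usr/share/java foo 2.0
maybe_update_link r/usr/share/java foo 1.0   # older: link stays on 2.0
readlink r/usr/share/java/foo.jar            # -> foo-2.0.jar
```

This mirrors what update-alternatives does with priorities, which is why the discussion lands on alternatives as the existing mechanism.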
So that's something which I've been wondering about: whether we should use OSGi for how we deal with versions and transitions and recursive class path loading and so on. What I'm actually interested to know is whether Jigsaw is the thing we should be moving to, because it's built into the JVM rather than being... well, I'll talk about that more later. But the argument from Sun, and obviously this has been kind of a political thing too, has been that OSGi lives outside of the Java world: the JVM is not really aware of it, and that isn't sufficient. You actually want the compiler and the VM and the class loader to be module-aware, and that's what Jigsaw is doing: it's bringing that module awareness into the VM, to the point that you get rid of the class path. And I think that's the sort of direction we should be moving in, because it solves a lot of these problems. Well, the reason I don't like OSGi is because it gets super sloppy, and then they end up shipping 20 versions of the same jar with an application. Sure, but I mean, that's one of the things we can try and fix up, and helper tools are useful for this. Another thing that I like about Jigsaw is that, besides solving this versioning thing, there's a performance boost. I mean, that's kind of the big purpose: reducing the number of files that need to be sucked up off the disk. So if you've got something that's a little bit more interactive... for servers, okay, fine, it doesn't matter, but if you've got some sort of command-line something-or-other that's built in Java, it's the I/O that kills you.
Well, one thing that I think this versioning-of-jars issue really gets to is, over time, the maintainability of an archive of Java applications and libraries, all of which have been moving, and whether there is a way that we can avoid having to tweak each and every reverse dependency every time a library changes. I think that's it, and I don't know if there's a silver bullet for that. Well, at the moment it's not possible to do binNMUs for arch-all packages, but actually I think, at the point where we have helper tools that generate a lot of this stuff at build time for you, a binNMU of an arch-all package starts to make sense, and I have actually talked about this to the FTP masters, and they generally seem happy to do the work to allow this to be possible. And then, hopefully, and I'm going to talk about packaging tools after lunch in the javahelper talk, if we have packaging tools that sort out all the dependencies, inferring this, at the moment it has to be from class path information, but once you've got the Jigsaw stuff or OSGi or whatever, this can all be worked out for you by the packaging helpers, then if you do have a transition, hopefully it's just a case of rebuilding. So in that world, you're saying you wouldn't have the symlink, you would have explicit versioning, and not try to assume backwards compatibility in versioning? What I've been thinking of is that we have the symlink, but perhaps when you build the package, it will make all of your Jigsaw or class path or whatever dependencies be on the versioned one, and that version only changes when you actually get an ABI change. So you end up with the version in your dependencies, a versioned dependency on the package, and all this gets built for you by your packaging helper. So if a library gets upgraded and there's no ABI change, it should just work; and if there is an ABI change, then if a package works with the new ABI you can just rebuild it, and if it doesn't work
with the new ABI, then you need to make actual source changes; but you were going to need to do that anyway, because what you use in the ABI is broken. Somebody mentioned on IRC that one of the problems is that Java developers don't care so much about ABI changes, so maybe something else we can do is try to convince them that they should start behaving more like their C library counterparts and taking their ABI a little more seriously than just "whatever I put in my public classes". If we can get upstreams to actually understand where we're coming from, and why having stable ABIs, and knowing when they break them, is good for them and for everybody else who's trying to use their stuff, then that's really good, and I think it would improve matters a lot. I don't know whether the advantage of Jigsaw is likely to convince people that they need to provide this information upstream or not. I don't want to jump ahead, but I think that's going to be a tough sell. But I mean, the people we talked to at FOSDEM... everybody had reasons for what they were doing, but I think a lot of them appreciated understanding more about what was going on from our point of view. So, that's transitions. I think that's one of the big things it would be nice to solve. We're obviously not going to make any changes for squeeze at this point, but if we can get to a point where we have a good story for how transitions happen, in the long run it's going to make all our lives a lot easier, because we aren't going to be doing make-work whenever a library changes and having to make source changes all over the place. So, a few other bits and pieces that have been mentioned or have occurred to me. First of all, class file versioning: should we require that people build with Java 1.5 class files where possible, or whatever the lowest common denominator of what we have in the archive is? And do we need to do anything to support
multiarch? I mean, we definitely need to do a few things to support multiarch, and we should start thinking about those, because I believe multiarch, in at least some form, is going to be ready for squeeze. So the first thing I think we need to do is that the VMs need to be declared multiarch-capable, so that you can have VMs for a particular architecture. I don't think any of the architecture-independent packages need to know about multiarch, but we should probably sort out the JVMs sooner rather than later. And, yeah, I'm out of policy-related things that I wanted to talk about, but if people have things that they would like to talk about, then please do chime in at this point. My question is: what's the motivation for the class file versions? I mean, shouldn't it be dictated by the minimum version of Java that is needed to actually compile it? Although there are two issues there: the source and the target version. You can have a source that requires Java 1.6 but can be built into classes that will work with 1.5. If we don't mandate anything in policy, then people will get whatever the default is, and that's fine, and in a lot of cases that will just be the most recent one that they built with. Some were suggesting it might be good for us to say that we'd like, if you can, to use an older class file version, because that'll make it more compatible with other VMs, if people want to use them, or whatever. Just a comment: nothing's supported any more except for Java 1.6, unless you pay for it. I don't work there any more. Well, that wasn't a barb; you were looking at me, though. For my day job, I happen to know that if a company has extended support, then 1.5 is still supported for them, but not publicly, because we mandate that we ship everything as Java 1.5 and compile it that way to make sure that our clients can use an older VM. I certainly understand that, but it's not really an answer to the Debian question. I almost wonder whether the
precedent in Debian is to always be looking forward and moving forward. At the moment it might not matter, but at the point we start getting 1.7 or whatever, do we want to have a policy on this? Could that be expressed by the JRE depends, if you said java7-runtime or something? So there are a couple of issues with this. First of all, yes: if you build with javahelper and use the automatic dependency generation, it will look at all of the classes, work out what the highest class file version is, and use that to generate the runtime dependency list. However, if you are compiled as 1.5 and one of your libraries is compiled as 1.6, then it can't necessarily tell this. I think I try and do some recursive stuff over class paths there, but that's the sort of thing where it's more of an issue, because a Java library doesn't have a dependency on JREs, so it can't express that it needs a particular version. So, not on the Java policy per se, but besides the policy: do we have a packaging guide, or do we want to start working on that? We definitely should have one. Like I said, my next talk will have the links to it, but on pkg-java.alioth.debian.org/examples, which isn't linked anywhere but will be shortly, there are lists of example packages, to say: if you have a library that's being built like this, you want your Debian packaging to look a bit like this. I think we want to add some more examples there; sadly, I don't know anything about Maven, so it's hard for me to write those, but if people want to contribute them, that would be great. We do have some stuff on the wiki and our Alioth pages which tries to give you packaging guides; I think quite a lot of it is outdated, and we should really be working to get it up to date. The Debian policy normally is in sync with the guides, so it would be nice to have both documents similar, or very close. Quick question: there's been an open bug against debian-policy for some time, filed by
someone who, I don't think, was actually directly involved in the Java policy discussion at all, asking whether the Java policy should be incorporated as a sub-policy into the Debian policy. As a policy maintainer I don't really care: if you guys are comfortable maintaining it separately in java-common, that seems fine to me. But I thought I would bring it up here and ask what people think they would like to do.

What happens with other language-specific policies? Because I know Perl and some others have these sorts of things.

Currently the only language-specific policy that's maintained as part of the policy package is the Perl policy, although there are also a few other things in there, like the debconf policy. Right now all the policies in the policy package are maintained via the same procedure: somebody proposes language, after some discussion it needs three seconds — where the person proposing the language is counted as one of those three — and then it gets incorporated in the next release. We can set up a different procedure for a sub-policy if there's some reason to, though, so we don't necessarily need to follow that same procedure. I think it's more a matter of convenience for whoever's maintaining it. I don't have any trouble having it be out of tree; the Emacs policy has been maintained with the Emacs packages for years and is not part of the actual Debian policy package. But some people do seem to have a perception that if it's in the Debian policy package, it's somehow more official.

Well, I think we're now at the point where it's not so outdated that it would be embarrassing to link it from Debian policy, so I think at least having Debian policy mention it, even if it's out of tree, would be a good thing. I don't have a particularly strong view on whether or not we should integrate it. I suppose it's interesting to consider whether violations of sub-policies are RC by default, and whether or not having it
incorporated in Debian policy, rather than out of tree, affects that. Violations of Debian policy are not RC by definition either: serious violations get severity serious, which is RC by default. Officially the release managers set RC policy, and if they decide that something is not RC — even if it's a "must" in policy — then they override that. But I think that in practice everybody feels that the language policies, if they are believed to be stable and fairly complete (for example, I know the Python policy has gone through various iterations and is not necessarily in that state right now), should be treated like policy, whether they're part of the policy package or not. I mean, if the language experts in Debian say "don't do it this way" and someone does it that way, then I think that's an RC bug by default. So I don't think it really changes anything very much there, but it might change perception — that's always kind of fuzzy. It might also change how easy it is for people to find it, which is a bigger thing as well: generally, if it ends up in the policy package — not that I've explicitly talked to the Debian web team about this or anything — it generally ends up in that section on the Debian website as well, which helps people find it.

I vaguely remember that in the new maintainer quiz there were some Debian-and-Java-related questions: when you took the exam to become a Debian developer, they asked you stuff related to the Debian Java policy. That's actually no longer true, but it was.

There may well be examples in there, but I think the language-specific stuff is generally such that if you have somebody who's working on this language, they should be able to answer questions about it; other people, not necessarily. Certainly the recent style of running an NM is that you can look at all of their work, and that should answer most of the questions without having to actually ask them.

It used to be the case — and I know this because I went through it — it used to be the case that an NM
did get asked a few questions about language-specific policies, but this was more a way of trying to figure out whether people knew what they were and where to find them, rather than to see whether they knew how they were set up. And because it was often misunderstood, they've actually changed that now, so it's not the way it used to be any more.

Should we include anything about how to set up Maven repositories in the policy?

I think it would be useful to mention Maven more than we do — specifically, that if you have a Maven package then you should be using the Debian Maven repository, and these sorts of things. I actually have a note from when we mentioned it earlier that I think we should include. We want to be careful to specify only the behaviour it should have, and not necessarily what tools you should be using to achieve it; that's something we should have guides for, but I don't think it should be in policy.

And what about trying to enforce manifests with class paths?

When we come up with a solution for recursive class paths — whether that be Class-Path entries in the manifest or anything else — it definitely should be specified in policy, and we should mandate it, because I don't think it's reasonable to expect everybody to know about the whole recursive dependency tree below them. They should only have to include in their class path the jars that they use directly, and those jars then pull in the jars they require. I don't know — what's people's opinion? Should we require this, with libraries using Class-Path manifest entries, or does anybody think there's a problem with that?
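An aside to make the proposal concrete: the Class-Path manifest attribute under discussion is a space-separated list of jar paths that the JVM resolves, recursively, whenever the jar is loaded. A minimal sketch of reading it with the standard java.util.jar API — the manifest text and jar paths below are invented for illustration, not taken from any real package:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.util.jar.Attributes;
import java.util.jar.Manifest;

public class ManifestClassPath {
    public static void main(String[] args) throws IOException {
        // A manifest like the one a Debian library jar might ship, with a
        // Class-Path entry pointing at its own dependencies (paths invented).
        String mf = "Manifest-Version: 1.0\r\n"
                  + "Class-Path: /usr/share/java/commons-logging.jar /usr/share/java/foo.jar\r\n"
                  + "\r\n";
        Manifest manifest = new Manifest(new ByteArrayInputStream(mf.getBytes("UTF-8")));
        String cp = manifest.getMainAttributes().getValue(Attributes.Name.CLASS_PATH);
        // The JVM splits this space-separated list and loads each jar (and,
        // recursively, the Class-Path entries of those jars) by itself.
        for (String entry : cp.split(" ")) {
            System.out.println(entry);
        }
    }
}
```

Because the JVM honours these entries for any jar it loads, a library jar shipping a correct Class-Path means its users only need to name that one jar, which is exactly the point being argued here.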
Well, I mean, you know it when you build it — well, you have execution requirements; if you are getting things through reflection, you may just not know it.

That's kind of a different issue, because if you're doing that, then the jar will actually do the loading for you, so you don't actually need it to be in a manifest entry. It's only an issue if you're expecting somebody to put it on the class path — and you shouldn't be doing that.

I disagree: if you're doing reflection to get the stuff, you still need it on the class path.

Well, yeah, OK, there are situations like that where you do need it on the class path. I think if you've got your own class loader then you can do it yourself, and you don't need it on the class path. If people are doing things where they need to have something on the class path, but it's up to the program which is using the library as to what that is, then that is a reasonable thing for the program using the library to have to know about, so that's not necessarily something that should be in there.

In a sense, but that stuff ends up being a dependency of the library, so the maintainer has to have some idea that it is required to run the library.

I think it's actually also, in the same way, a dependency of the program using the library: the library says "you have to use me with one of these things", you have to determine which of them it is, and you set that up by putting it in your class path. That is a reasonable thing to expose upwards. But where it's "you need this jar", that should be handled by the jar in question. So if all the jars are living in /usr/share/java, then my assumption is that most JVMs can use the manifest in the jar out of the box, and that will work. Where this breaks down is if people have private jars — although you can put full paths in; I believe you can put explicit full paths in manifest entries as well as relative
paths, of course.

Where I'm going with that is that I would hate to have a wrapper do the recursive inspection of the class path, constructing a correct class path before launching, because that would be really slow.

Well, the JVM will recursively enumerate class paths for you: if you have a jar with a Class-Path in its manifest, and somebody puts you in their class path, the JVM will load both you and the things that are in your Class-Path.

Right, and I'd much rather that happened in the JVM than in a shell script.

At the moment it happens in shell scripts — you have to do it manually. As for putting it in a Class-Path entry in the manifest: the big issue here is that hardly any upstream actually puts a Class-Path entry in the manifest. If you come back to the talk after lunch, I will talk about packaging helpers which help you fix up these sorts of things; we can write tools to do this for you.

A naive question: can somebody explain to me the level of multiarch support that is targeted for squeeze? How far are we going with multiarch? Is there anybody in the audience who happens to know this better than I do?

I'm not sure of the exact status. I talked to Steve a little bit about it yesterday; he said that he thinks it's pretty much ready to go into squeeze, with some standardization required to figure out exactly what to call the paths, but the overall design is pretty much done. I believe the dpkg support is pretty much done, and the libc support for finding the stuff is basically done. So I think there's general agreement on where to install things, and the only hard parts are things like hard versus soft float on ARM — how you name the directories appropriately so that you don't end up colliding. I think that most of the work there has been done on C libraries, and anything for any of the other languages is harder. I believe the intention was to defer figuring out what to do about Perl modules, for example, until the
next release after squeeze, so Java may be in a similar situation there. I think the idea was that the dpkg and C library support — all of the supporting infrastructure — is in for squeeze, and a small number of the most relevant C libraries will be set up to work with multiarch, but everything else will be sorted out in a future release. And I'm sure that one of the goals is to get rid of that giant amd64 package collecting random i386 libraries — I think that's what they're mainly aiming to sort out.

So, are there any other questions, or shall we all go to lunch a bit early? Cool, I think we're done.
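A closing footnote on the class-file-version discussion from earlier in the session: the check that automatic dependency generation performs can be sketched in a few lines. The header layout (4-byte magic, 2-byte minor, 2-byte major; major 49 is Java 5, 50 is Java 6) is fixed by the class file format, but exactly how a helper like Javahelper maps that onto a Debian dependency is an assumption here, not a description of its implementation.

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;

public class ClassFileVersion {
    // Read the major class-file version from the start of a .class file.
    static int majorVersion(byte[] classFile) throws IOException {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(classFile));
        if (in.readInt() != 0xCAFEBABE) {
            throw new IOException("not a class file");
        }
        in.readUnsignedShort(); // minor version, unused here
        return in.readUnsignedShort(); // major version: 49 = Java 5, 50 = Java 6
    }

    public static void main(String[] args) throws IOException {
        // Synthetic header for a class compiled with -target 1.6 (major 50).
        byte[] header = { (byte) 0xCA, (byte) 0xFE, (byte) 0xBA, (byte) 0xBE,
                          0, 0, 0, 50 };
        System.out.println(majorVersion(header)); // prints 50, i.e. Java 6
    }
}
```

A tool can take the maximum major version over every class in a jar and emit the matching runtime dependency (e.g. the java6-runtime virtual package for major 50) — though, as noted above, this cannot see classes your libraries were compiled with.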