OK, hi. Good afternoon. The next talk comes back to Jigsaw a little bit. We have seen Mark Reinhold's talk in the morning. This talk is not directly about how to implement something with Jigsaw; it's more about what to do if you have an existing project and you want it to also run fine on Java 9 with Jigsaw, because there are always problems when you import external libraries and all that stuff, and suddenly something breaks. I'm originally from the Apache Lucene project, and we have done Java 9 testing, including Jigsaw, since the very beginning, when the first preview release came out. So we know what you needed to change at that time, and how you should not write your code, because if you do, it will break with the module system. This talk will not be about how to convert your current project to a module; it's about how to make your project work with the module system and everything around it. Of course, Lucene will at some point also be available as a module, but currently it targets Java 8, and we have no plans yet to change that; maybe with version 8, which would come out next year at some point, we might change that.

OK, so what changed with Jigsaw? The biggest problem for existing code is that the module system changed the runtime so that you can no longer simply access any API that was previously available in it: private APIs, or public APIs where you want to access private members, and all that stuff. It's very restricted: your code only sees classes from packages that are exported to your code. If you have a legacy project running on the class path, it lives in the so-called unnamed module, and there are restrictions on what the unnamed module can see.
And the second thing, a recent change in Jigsaw that came with build 148, is that even in public APIs it's no longer possible, for example, to make a private member of a runtime class accessible, like looking at the byte array behind a String. That no longer works without additional exceptions being thrown, unless you pass command line parameters to disable those checks. So for most cases it looks like you were running your code with a security manager where everything that has to do with making members accessible or loading classes is disallowed. In reality it's not implemented like that, but it looks like that.

The first category is compile-time problems, which a lot of very old legacy projects know: Java code that uses sun.misc APIs. One famous example is the sun.misc BASE64Encoder and BASE64Decoder. Of course sun.misc.Unsafe belongs here too, and from other projects I know there are also JavaFX-internal APIs that are used by a lot of projects out there. If you compile code that directly references, in the source code and not via reflection, something like an import of the sun.misc BASE64 encoder or decoder, then it will fail; depending on the case it could be a ClassNotFoundException, but it could also be an IllegalAccessError. For example, here's one bad example, it makes no sense at all, it was just something I tested: if you use that restricted sun.security.sasl provider class, I don't even know what it does, you can compile that code with Java 6 and Java 7, but if you run it with Java 9, you get this IllegalAccessError: class Test1 in the unnamed module, which is our class path, cannot access this class in the java.security.sasl module, because the module doesn't export the package. So this breaks.
Figuring out what's wrong here is relatively easy, because you can scan your code with the jdeps tool, which has been available since Java 8; there's also a Maven plugin for it, and it makes this easy because you immediately see what's wrong. The second possibility, which also works with older Java versions, is the forbidden-apis plugin. Both tools have a slightly different background: the first is more for figuring out dependencies, and the second is for disallowing your bytecode to execute specific methods, but for this use case both do the same. For the second one there is the jdk-non-portable switch, which disallows access to any class that is part of the Java runtime but is not in one of the public modules. It is available as a Maven, Gradle, and Ant plugin and works since Java 6.

So here's the example, it's a little bit small: the bad code using the provider already produces a warning at compile time, but most people, especially with some imported external JAR dependency, compiled it at some point, published it on Maven Central, ignored the warning, and then you use it in your own project and it breaks because of that. You can check such a class with jdeps, and it reports that it uses the internal API; the same happens if you use the other tool. So those are two possibilities to figure this out when the reference is static, in your compiled code. The problem with that approach is that it won't help if reflection was used, which is the case for most hacks.

So what about reflection? Clever people use reflection to access those private and internal Java APIs, of course, because the jdeps tool says you should not reference them directly; so maybe we do it with reflection, try it, and if it doesn't work, we don't do it. Good idea; works in most cases.
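To make the static-detection case concrete, here is a minimal, invented example of the kind of class both tools would flag: it has a direct compile-time reference to an internal sun.misc class (the jdeps invocation and its output in the comments are approximate):

```java
// Hypothetical example of code that jdeps (or forbidden-apis with the
// jdk-non-portable signatures) would flag: a direct compile-time
// reference to an internal sun.misc class.
import sun.misc.Unsafe; // internal API, only reachable via jdk.unsupported

public class UsesInternalApi {
    public static String internalClassName() {
        // Merely referencing the type creates the static dependency
        // that the analysis tools can detect in the bytecode.
        return Unsafe.class.getName();
    }

    public static void main(String[] args) {
        System.out.println(internalClassName());
        // Running "jdeps -jdkinternals UsesInternalApi.class" would
        // report the dependency on sun.misc.Unsafe as a JDK internal API.
    }
}
```

A reflection-based version of the same dependency would leave nothing in the constant pool for these tools to find, which is exactly the limitation described above.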
The pro is that you have no compile-time dependency on any Oracle JDK, so you can access sun.misc.Unsafe and it works; but sometimes you also need private methods or fields, so you need to make private members of those classes accessible. And that's the problem with reflection hacks. The first failure mode is a ClassNotFoundException when you try to load the class in your reflection code, and the second is that you are not allowed to make a member of that class accessible. The downside is that static analysis won't help here, because there are no direct references to private APIs to find. Much worse, what we have seen in a lot of projects on Maven Central when trying to import them is that they have some reflection hack that works, but only with the Oracle JDK; as soon as you use another JDK, or Java 9, it completely breaks, because the code is somehow broken.

The first really horrible thing is that a huge number of libraries call setAccessible on every class member out there. Scripting languages like JRuby do that by default, for some reason, but it's really horrible how many libraries do it. And it's not always done correctly, because in those cases they also forget to wrap it in AccessController.doPrivileged. That has nothing to do with Java 9, but you will see it become helpful in the next case. And because people forget to do a correct try/catch, the most horrible thing is a static initializer that just does e.printStackTrace() when a reflection error happens, and a little later you get a NullPointerException because some variable was not initialized, or they wrap it in a RuntimeException and do nothing more.
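The failure mode of a throwing static initializer can be sketched in a few lines; all the names here are invented, and the thrown RuntimeException stands in for a failed reflection lookup on an internal API:

```java
// Sketch of the anti-pattern described above (all names invented): a
// static initializer that fails once poisons the class, and every
// later use of it throws NoClassDefFoundError forever.
class BrokenHack {
    static Object internal;
    static {
        if (hackFails()) {
            // imagine a reflection lookup on an internal API failing here
            throw new RuntimeException("reflection hack failed");
        }
    }
    static boolean hackFails() { return true; }
}

public class StaticInitDemo {
    private static String result;

    static synchronized String probe() {
        if (result != null) return result; // run the experiment only once
        String first, second;
        try {
            Object o = BrokenHack.internal; // triggers class initialization
            first = "ok";
        } catch (Throwable t) {
            first = t.getClass().getSimpleName();
        }
        try {
            Object o = BrokenHack.internal; // class is now permanently unusable
            second = "ok";
        } catch (Throwable t) {
            second = t.getClass().getSimpleName();
        }
        result = first + "," + second;
        return result;
    }

    public static void main(String[] args) {
        System.out.println(probe());
    }
}
```

The first access surfaces as ExceptionInInitializerError; every access after that gets a NoClassDefFoundError, often far away from the code that actually broke.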
The problem with failing in static initializers is that if there's no alternative solution, the static initializer breaks, and whenever you try to access the class afterwards, you get a NoClassDefFoundError. You get it forever, because the static initializer threw an exception the class cannot recover from. We have seen that in Amazon's libraries for accessing Hadoop, we have seen stuff like that in Apache POI, and in a lot of other libraries it's done like that.

The second thing I wanted to talk about is the setAccessible part, because in early Jigsaw builds you were still able to do more than you can now, since build 148. There was a problem: for example, you have the public class java.lang.String in your runtime, and it is exported to the unnamed module and to every other module. But user code, because the class is accessible and you can see it, was still able to make, for example, the char array or byte array behind the String visible by calling setAccessible on that field. To prevent that, this was switched off in build 148. If you want to do it now, you have to explicitly open the package for reflective access to the outside, and of course, for almost all public APIs in the Java runtime, that is not done. So with build 148 it's no longer possible to access the byte array or char array behind a String.

So, as a conclusion, what no longer works? Class.forName on any class from non-exported packages has not worked since the beginning. Second, AccessibleObject.setAccessible no longer works on internal members of public runtime classes. There are some exceptions, like sun.misc.Unsafe: to get the Unsafe instance, you need to make the theUnsafe field accessible, and that's explicitly allowed in the module definition; I think it's the jdk.unsupported module or something like that. So you can still do that on various classes.
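A small probe for the String example just described: whether the setAccessible call succeeds or throws depends on the JDK version and its command line flags, so the code reports either outcome instead of assuming one. Note that it catches RuntimeException rather than InaccessibleObjectException, for the Java 8 compatibility reason discussed below:

```java
// Try to make java.lang.String's internal "value" array accessible.
// On older JDKs (or with the right --add-opens flag) this succeeds;
// under strong encapsulation it throws InaccessibleObjectException,
// which is new in Java 9, so code that must still compile on Java 8
// has to catch RuntimeException instead.
import java.lang.reflect.Field;

public class StringValueProbe {
    static String probe() {
        try {
            Field value = String.class.getDeclaredField("value");
            value.setAccessible(true); // the call that build 148 locked down
            return "allowed";
        } catch (SecurityException e) {
            return "blocked: " + e.getClass().getSimpleName();
        } catch (RuntimeException e) {
            // InaccessibleObjectException lands here; we cannot name it
            // directly if the code must still compile on Java 8.
            return "blocked: " + e.getClass().getSimpleName();
        } catch (NoSuchFieldException e) {
            return "no such field";
        }
    }

    public static void main(String[] args) {
        System.out.println(probe());
    }
}
```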
The problem is that there is no tool to detect those kinds of reflective accesses to private APIs. The only thing you can currently do is, for example, use the previously mentioned forbidden-apis plugin and simply disallow calling setAccessible anywhere in your Java code or in libraries. But of course this also breaks your own code if you have, say, tests that make some private fields accessible. So it's not the best solution.

But there is an approach that works quite well, though not perfectly: run your tests with a security manager enabled, because by default, all reflective access to members that are private or not public is disallowed by the security manager. This is, for example, what Apache Lucene, Apache Solr, and Elasticsearch are doing: the whole test suite runs under a security manager. Of course, we have to open up some access, maybe to read system properties and all that stuff, but you should not allow things like accessing sun.misc classes, because that is prevented by the security manager by default. The exceptions you get are not the same as on Java 9, but it helps you, at test runtime, to figure out whether some of your bundled dependencies are doing crazy stuff inside.

Now, finally, there are some important patterns for how you should write your code. The first: if you do hacks like that, if you really need Unsafe and do something with it to make something accessible or whatever, please add a fallback. Don't just fail in your static initializer. In most cases a fallback is available; maybe there's a small slowdown, because you cannot directly use that perfectly fitting internal feature or you cannot work around the bug, but please don't just fail. And the second thing, which especially affects setAccessible: catch SecurityException, as you always should when doing reflection, but also catch RuntimeException.
Why RuntimeException? This is a little special, because the setAccessible call throws InaccessibleObjectException when you try to make such internal members accessible. The problem is that InaccessibleObjectException is new in Java 9, so if your project targets Java 7 or Java 8, you cannot name it in your code; you have to catch the parent class, RuntimeException, and then do some string checks. I know, for example, that Groovy is doing that when it tries to access all the private members of classes: they catch RuntimeException and then compare the class name. It would be much better if it were a SecurityException, because most existing code would already handle that, but unfortunately SecurityException does not fit the use case here. There was a longer discussion about that on the OpenJDK mailing list, so you have to live with it.

And the other thing, as I said before: don't fail in your static initializers. Save your error details while you are trying to initialize the private API hacks, like accessing Unsafe. Of course, use AccessController.doPrivileged, because static initializers can be triggered from anywhere in the code: suddenly some class references your class, loads it, and the static initializer runs, so if you have a security policy you cannot simply forbid the access, because it would also affect your own code. By using doPrivileged you make the access local to your class, so you can then have a security policy that, for example, forbids it. And if a consumer later uses the part of your library that calls any of these internal features, and for some reason you could not initialize it, replay the original error message you got when you tried to make the private APIs accessible, as an UnsupportedOperationException or something like that.
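Those recommendations can be sketched as one defensive initialization pattern; all class and method names here are illustrative stand-ins, not the actual Lucene code, and the internal class being looked up is deliberately one that does not exist:

```java
// Sketch of the recommended pattern: do the privileged reflection hack
// once in the static initializer, never fail there, save the reason if
// it does not work, and replay it only when a caller needs the feature.
import java.lang.reflect.Method;
import java.security.AccessController;
import java.security.PrivilegedAction;

public final class NativeTrick {
    private static final Method HACK;        // null if unavailable
    private static final String FAIL_REASON; // null if available

    static {
        Method m = null;
        String reason = null;
        try {
            // doPrivileged keeps the security-policy decision local to
            // this class, not to whoever happened to trigger loading it.
            m = AccessController.doPrivileged((PrivilegedAction<Method>) () -> {
                try {
                    // hypothetical internal API lookup (class is made up)
                    return Class.forName("some.internal.Cleaner")
                                .getMethod("clean");
                } catch (ReflectiveOperationException e) {
                    throw new RuntimeException(e);
                }
            });
        } catch (RuntimeException e) {
            // also covers SecurityException and, on Java 9, the
            // InaccessibleObjectException we cannot name on Java 8
            reason = e.toString();
        }
        HACK = m;
        FAIL_REASON = reason;
    }

    public static boolean isSupported() { return HACK != null; }

    public static void invokeHack(Object target) throws Exception {
        if (HACK == null) {
            // replay the original error for the consumer
            throw new UnsupportedOperationException(
                "internal API unavailable: " + FAIL_REASON);
        }
        HACK.invoke(target);
    }
}
```

The important property: loading NativeTrick never throws, and the consumer only sees the saved error at the point where the unsupported feature is actually requested.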
Another thing that's very useful when writing code that hacks around with Unsafe and all that stuff is not to use pure reflection but method handles instead. Functionally there's not really a difference between the two, but when you create a method handle, it is as if javac compiles and type-checks the whole thing at the time you create the handle, not when you call it. So once you have the method handle, you know it works. Of course something behind the scenes could still fail, but you know that all the types in your unsupported API are correct, and you can also build programming logic with guardWithTest and all that, and when you finally call your hack you will not get any linkage errors.

Here's an example that is quite famous, because it also broke with the 148 builds: Apache Lucene's MappedByteBuffer unmapping. This is also used by other projects, and it's a long ongoing story; the bug is from 2005, I think, Mark? Yeah, about that. It's about memory mapping: if you memory-map files, or if you have direct buffers that are off-heap, you have to wait for the garbage collector until they are unmapped. To work around that, a lot of projects that really need the resources back are forcefully unmapping that stuff, and there were some tricks for it. The Java 9 migration is tracked in this Lucene issue, and this is the really old bug on the OpenJDK bug tracker, but as I heard, with Java 10 there might be a solution now. Let's talk this evening, maybe at the dinner, about it. Yeah, I know, Andrew. He already finished it? No. OK.

So, it's a little bit small, but here's the original unmapping code. As I said before, it's using method handles.
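The two properties mentioned, eager type checking at lookup time and precompiled control flow via guardWithTest, can be shown with harmless stand-ins (String.length() instead of an internal API; the present/absent methods are invented, not Lucene's):

```java
// Method handles: resolved and type-checked when created, not when called.
import java.lang.invoke.MethodHandle;
import java.lang.invoke.MethodHandles;
import java.lang.invoke.MethodType;
import java.util.Objects;

public class HandleDemo {
    // Name and signature are checked here, at lookup time.
    static int lengthOf(String s) throws Throwable {
        MethodHandle length = MethodHandles.lookup().findVirtual(
            String.class, "length", MethodType.methodType(int.class));
        return (int) length.invokeExact(s);
    }

    // A wrong signature fails immediately at lookup, with
    // NoSuchMethodException -- no linkage error later at the call site.
    static boolean wrongSignatureRejected() {
        try {
            MethodHandles.lookup().findVirtual(String.class, "length",
                MethodType.methodType(long.class)); // length() returns int
            return false;
        } catch (ReflectiveOperationException e) {
            return true;
        }
    }

    // Precompiled "if (x != null) target else fallback", the same shape
    // as the cleaner null check in the unmapper described below.
    static String present(String s) { return "clean"; }
    static String absent(String s)  { return "noop"; }

    static String guardDemo() throws Throwable {
        MethodHandles.Lookup l = MethodHandles.lookup();
        MethodType strToStr = MethodType.methodType(String.class, String.class);
        MethodHandle test = l.findStatic(Objects.class, "nonNull",
                MethodType.methodType(boolean.class, Object.class))
            .asType(MethodType.methodType(boolean.class, String.class));
        MethodHandle guarded = MethodHandles.guardWithTest(test,
            l.findStatic(HandleDemo.class, "present", strToStr),
            l.findStatic(HandleDemo.class, "absent", strToStr));
        return (String) guarded.invokeExact((String) null) + ","
             + (String) guarded.invokeExact("x");
    }

    public static void main(String[] args) throws Throwable {
        System.out.println(lengthOf("jigsaw"));      // 6
        System.out.println(wrongSignatureRejected()); // true
        System.out.println(guardDemo());              // noop,clean
    }
}
```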
Basically, on Java 7 and Java 8, you needed to access a method called cleaner() on java.nio.DirectByteBuffer, which is a package-private class, and make it accessible. And as you see, it's in a public package, java.nio, and on the newest Java 9 builds this fails with InaccessibleObjectException. Then it composes something like an unmapper for that: it calls the cleaner, checks if it's non-null, and if it's non-null, it calls the cleaner; otherwise it does nothing. And this is all built as one method handle, so it's compiled up front; you don't go through the reflection route step by step at call time. At the end you have a single method handle that does the unmapping.

Of course, with Java 9 this needed to change a little. So they added, and this is one of those seldom-seen reasons, a new method on sun.misc.Unsafe, invokeCleaner, just for that, because Unsafe is still available in Java 9, at least until Java 10. Then you can still do it, and you get basically the same method handle. And at the end, here you see parts of the error handling: it catches SecurityException, and if it gets a ReflectiveOperationException or a RuntimeException, it falls back to the Java 8 code. This is how it initializes, and if none of it works at all, it saves an error message, which is then returned to the user if you try to enable the unmapping in the Lucene code.

Yeah, that was it, I think. Any questions? [Audience:] How many libraries did you have to touch, how many problems did you have to handle? [Speaker:] There are a lot of libraries; I only know those from Apache Lucene, Solr, and Elasticsearch. Lucene does not use many external libraries; Solr uses some hundred libraries. Everything that has to do with Hadoop does not work with Java 9 at the moment.
As far as I know, there was also Apache POI, which had another hack that tried to figure out the Java version number in a static initializer. It searched for the first period, which of course broke, because the new version number can be just "9" and nothing else. And in Elasticsearch, there are also the Amazon libraries for accessing EC2 and all that stuff, so there was a lot around that. The other thing is, of course, Groovy as a scripting engine: Gradle did not work anymore with the recent Java 9 builds because of such issues. [Audience:] Was anything working? [Speaker:] Yes, Lucene was working. No, no, it's working. I think Maven also had some minimal problems, but they are currently fixing them. Interestingly, of those build tools, Ant is working; only Maven and Gradle do not work. The original Ant doesn't have any problems with Java 9; maybe it's because the JDK builds itself, or parts of it, with Ant. Yeah, no, that was just a joke. But they are fixing it. So it looks like you need to update your build tools when you go to Java 9; otherwise it might be that you cannot run at all. Gradle 2.3, don't even try. OK.