Thanks a lot for your patience. I'm Florian Haftmann, and I work for the IT department of the city of Munich. What I'm actually talking about will hopefully become clear in the introduction. Some background: the municipality of Munich runs an Ubuntu-based Linux system on 18,000 machines with 33,000 users. Our leading office platform is LibreOffice 4.1. We also have old instances flying around which are still based on OpenOffice 3.2.1, but these are decreasing in number. The plan is that in half a year everybody will have moved to LibreOffice 4.1; let's see how that turns out in practice.

How did we actually get there? In 2004 there was a fundamental decision of the city council of Munich that, across the whole administration, a consolidation and transition had to be made from proprietary desktop environments to a consolidated desktop environment based on free software. This was put into practice using a strategy based on a soft migration. What did that look like? This slide shows a simplified view of the infrastructure. Your ultimate goal, at the baseline, is a consolidated operating system. But before you can really achieve that, it is better to first consolidate your office platform, for two reasons. The first is an organizational one: a consolidated operating system platform needs some kind of infrastructure, software distribution and so on, and if you start with the office migration first, you gain time to build up that infrastructure. The second is a social-psychological one: even nowadays, the office platform is the user's main window to the system. If you consolidate the office platform first, the user is still working on their previous operating system, and when you later exchange the operating system, the transition is not that hard; it is smoother.
By the time of the transition to a consolidated operating system, the users are already acquainted with the new office platform, and so on. But the office platform does not stand on its own. It stands in relation to various pre-existing office applications, which are connected to certain administrative or business processes. So if you want to consolidate the office platform, you also have to consolidate those applications. And that is a good opportunity to create added value for your customers and promote acceptance of your office platform.

To go a little bit into detail: before the migration started, the situation was more or less this. At the baseline there were various versions of a popular proprietary office suite flying around; I don't think I have to be more specific here. And on top of this there was a rich zoo, and I really mean this literally, a rich zoo of office-based business solutions. Some were small, ranging from simple templates, but not consolidated templates: each department in the city had its own sets of templates, forms and also macros, any kind of stuff you can imagine. After the migration, at the baseline there was a free software office suite with a managed lifecycle, a centrally managed lifecycle. Using this as a base, an extension application for OpenOffice/LibreOffice has been developed, the so-called WollMux, which is often advertised as a letterhead system. It augments the base office functionality, for example by a more accessible mail merge, and it also supports typical processes of a public administration. Most of you maybe know this: you have to fill out a sheet, which first goes to one department, then from there to another office, which puts a stamp on it, and then on to a third office, and so on. In German this is called a "Sachleitende Verfügung", and it is specifically supported by WollMux.
So this really turned out to be a strategic key component in the whole story. On top of this there was now the opportunity to provide consolidated templates all across the city. And, which might not be very surprising, there were some office-based artifacts flying around, some macros or whatever, for which it turned out that office was simply not the platform of choice to implement them. These could be migrated, for example, to web-based applications; mostly these were just database applications, for which infrastructure already existed in the city, so they could easily be gotten rid of. But now we are in the rightmost corner. We started with 21,000 artifacts, and in the end we still had 100 artifacts which could not be absorbed by WollMux, could not be absorbed by templates, and could not be migrated to web-based applications. Very specific things, actually; a small percentage, but still there. Just to give you some examples: mostly spreadsheets with some macros, supporting parking lot management, letters to parents, or dog tags. For those who know a little administrative German, the original terms are also on the slide.

And now this is actually the explanation of the title of this presentation. I use the term macro synonymously for a lightweight office-based business application. Why this definition? First, it is a lightweight thing: I don't want to implement heavyweight things as macros based on office, that just doesn't make sense. It is office-based, because otherwise I wouldn't have to care about office at all. And it is a business application, so it is not a trivial thing: you need to understand what the business process behind it actually is to really implement it. And in the course of the migration, for these hundred specific things, there was the decision: okay, we will stick to macros here, because there was no reasonable alternative to this pragmatic approach.
Yes, and this tiny, obstinate core of macros is what we still have today within the IT department, and it is what this whole story is about. A small piece in the whole migration scenario, but a piece which is really a little obstinate, because it makes a lot of work.

If you ask people who work in the IT sector and tell them that you are engaging with office macros, these are a couple of possible answers you might get if you ask them for casual advice: why do we care about that stuff at all? If people start to implement macros, stop it by policy; keep away from that mess; and so on. So why do we care about macros at all? It is an acceptance issue, again a social issue. If you come with a consolidated LibreOffice platform, there are a lot of customers who have, let me paraphrase it that way, quite selective expectations towards a particular office platform. But if you are able to solve your customer's business requirements as an office service, they don't complain about the office platform at all, because they just want to get their work done. If you can't fulfill their expectations, they will maybe blame the office platform; but if you can fulfill them, the platform does not matter so much. And this is why we actually care about macros.

This was more or less the technical state after the initial transition, after the initial consolidation, which was done by an external service provider. We have about 100 macros. How are they distributed? They are downloaded from a city-internal wiki. We had no integration with the automated software distribution, so it is some kind of click-and-save deployment: you download from a page and then you can work with it. Of course, most of the time the administrator is actually doing that.
Why don't we have an integration with the automated software distribution? Well, you might remember that when the office consolidation started there was not yet a central software distribution, so it was just a pragmatic choice in the first instance. Technically, we have both office extensions and macros embedded into documents. The implementation language is mostly StarBasic, with two exceptions which are implemented in Java. User, admin and developer documentation is at hand. And while the macros are rather different in what they actually model, the technical design is quite uniform: there is a uniform interface to make them configurable in a certain manner, to access, for example, databases, and so on.

Okay, so this was the state after the initial migration. But then real life starts: the life cycle. You have to maintain the macros. Why? Because the office platform is also evolving, and this will inevitably create incompatibilities which you have to take care of. Here is an image; the arrows are really a little thin, but it doesn't matter. Have a look at the deployment chain. There are three stations: architecture is everything relating to design and so on; development is the work where you get it really running, get it tested and also accepted by the customer; and deployment is how you really ship it to the customer in production. And if you have a look at it, there are major pitfalls lurking inside. I will start with what is, in my view, the most critical one: embedded macros always have this intermingling of logic and data. You have a document and the macro is somewhere inside, not clearly technically separated, which is very annoying, for example if you have to provide your own test data and then have to copy and paste it somehow, and so on.
Also, it is not entirely clear in the first instance what a well-defined user interface for macros even is. And click-and-save deployment is not very fruitful in an environment where you have an automated software distribution. You might think: well, if the admin has to download it, it's okay. But the problem is that we essentially lose control: the macros are floating around, maybe with some arbitrary changes, and at some later moment this will come back at us.

So what can we do about this? Starting with the architectural design: here we agreed among ourselves that the well-defined user interface of a macro may consist of a dedicated menu, a dedicated toolbar, or a dedicated set of document templates which are presented to the user when creating a new document from an existing template. Concerning implementation, there is a clear preference for standalone extensions, but we still have to tolerate the existing embedded macros, because we don't want to reimplement them. With the screenshots I think it becomes quite obvious: a macro might add its own dedicated menu, here for the "Feuerbeschau", a fire safety inspection application. The letters-to-parents macro installs its own toolbar where you can click to get a dialog, or whatever. And macros can also provide just a set of templates, which is available through the File > New dialog. These are well-defined user interfaces, which might be backed by some piece of dedicated software.

Concerning the intermingling of application logic and data, we really had to develop our own weapons. The solution is plainly that we store extensions, and documents containing macros, in extracted form.
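The idea of extracted-form storage can be sketched in a few lines, assuming only that an ODF container is a plain zip file whose `mimetype` member must come first and uncompressed. The function names here are hypothetical illustrations, not the actual Munich tool:

```python
import zipfile
from pathlib import Path


def unpack_odf(document: str, target: str) -> list[str]:
    """Unpack an ODF container (a plain zip file) into a directory tree.

    Returns the list of extracted member names. This is only a sketch of
    the idea; the real tool additionally stores StarBasic code as plain
    text and performs consistency checks.
    """
    target_dir = Path(target)
    target_dir.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(document) as zf:
        zf.extractall(target_dir)
        return zf.namelist()


def pack_odf(source: str, document: str) -> None:
    """Re-create the ODF container from the extracted tree.

    Per the ODF packaging rules, the 'mimetype' member is written first
    and stored uncompressed.
    """
    source_dir = Path(source)
    with zipfile.ZipFile(document, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(source_dir / "mimetype", "mimetype",
                 compress_type=zipfile.ZIP_STORED)
        for path in sorted(source_dir.rglob("*")):
            rel = path.relative_to(source_dir)
            if path.is_file() and str(rel) != "mimetype":
                zf.write(path, str(rel))
```

With the document unpacked like this, the directory tree, not the binary container, becomes the unit you edit, diff and version.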
We all know that an ODF document is just a zip file; you can unzip it. But we wanted a little more, and for this reason we developed our own small tool, which is also available on the net. What does this tool do besides unzipping and zipping? It does some consistency checks: is the manifest well-formed, does the content conform to the manifest, and so on. But the real benefit is that StarBasic code is stored as real plain text. Natively, StarBasic is embedded into XML, which is not very convenient if you just want to use your favorite editor on it. So it is stored as real plain text. You might ask why we didn't use the flat OpenDocument format, which just stores everything in one big unzipped XML file. The problem is that you still have an intermingling there: you may have some binary data in there, which is just encoded as base64, I think, and then if some metadata in the document changes, this somehow shows up in the diff, and you just don't want that. Another benefit is that in special technical situations, for example macros with additional Java sources, you can just seamlessly embed those sources there. The point is that you have one build tool which is able to build all your macros: a clear, repeatable description of how to get from the sources to the running extension, the running macro, the running document, whatever.

This is just for illustration: these are typical example sources in an extension, and you see that all the StarBasic code is really saved in dedicated files, in plain text, with a dedicated file extension; no XML, at least not for the source code. The source code is really literally there in the files. And if you have it literally in the files, you can use all the default tools you use for development: versioning, for example Git, which you don't want to live without nowadays.
You can also easily do full-text analysis. If you know that the next LibreOffice version will have an incompatibility in this or that interface, you can just grep for it among 100 macros. And if you have to provide dedicated test data, you can just plug in alternative content which you don't deliver later on. So you have a rich set of possibilities to apply here during development to get your work done, just as you would in other programming environments.

Concerning deployment, the answer is quite simple: our Linux system is Debian-based, so we just use the Debian packaging system. There is a canonical, scripted way by which a Debian package is produced out of a macro. It installs the extension, and it installs the templates into the appropriate locations if templates are present. It also includes the user documentation as a PDF, so it is self-contained: if the macro is installed on a system, the user documentation is also at hand without any further interaction.

There is something we also do which might seem strange in the first instance: for embedded macros, we actually encrypt the source itself. The macro is operative, but you can't inspect the code and you can't change it. Why are we doing this? It seems strange, especially in a free software environment, but it's not about freedom, it's about control over software. At least in theory, the recommendation is that throughout your organization there is one authoritative point for software, one authoritative repository. For office extensions that's not a big problem: they are installed in the system space and an arbitrary user is not able to change them. But embedded macros are floating around in the user space, and we want to prevent unconstrained changes from happening.
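To illustrate the Debian-based deployment described above: a package carrying an extension macro can register it system-wide from its maintainer script. This is only a sketch; the package layout and the path `/usr/share/lhm-macros/parkraum.oxt` (alluding to the parking lot example) are hypothetical, not the actual Munich packaging:

```shell
#!/bin/sh
# debian/postinst (sketch): register the extension on installation.
set -e

case "$1" in
    configure)
        # "--shared" installs into the system space, so an ordinary
        # user cannot remove or tamper with the extension.
        unopkg add --shared /usr/share/lhm-macros/parkraum.oxt
        ;;
esac

#DEBHELPER#
```

A matching `prerm` would call `unopkg remove --shared` with the extension identifier, and the user documentation PDF would simply be shipped under `/usr/share/doc/` as with any Debian package.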
These are unnoticed changes, because later on, if there is a platform problem, we just get a ticket with an attached file: "there's a problem with the macro, please repair it". Then you open it and you are first busy analyzing what the relation of what you got actually is to what is in your repository. That's the plain reason for this. And it's automated: the encryption only happens during deployment, so during development you don't have to care about it. It's just a small tool; you can encrypt macros using UNO, actually, and it works out quite well. And of course everything is automated, so there is a dedicated, automated way to get from your sources to a package. The sources are the ultimate authority; it does not depend on your daily mood how the macro looks afterwards.

Just a short stock-taking: what are the really vital elements in our deployment chain? The architectural decisions that we avoid embedded macros and that a macro has a well-defined user interface. Concerning deployment, the source storage with the corresponding tools and the automated Debian packaging. These are quite essential things which we have developed and which, as we hope, will really help us to maintain those macros over their life cycle.

Of course there are always missing pieces, always. You might remember that most macros are actually implemented in StarBasic. You could ask why; mostly it has historic reasons. Of course there are a lot of different technologies, but if you want to explore alternatives, it's not only about the implementation language, it's also about the possibilities to debug. In StarBasic you have at least a very simple embedded integrated development environment where you can set breakpoints and watch something. How to do this if you implement macros in Python or Java is not that clear in the first instance.
So you would first have to evaluate how well this turns out in practice, but it's still a question worth following up. And concerning the Debian packaging, there is still a missing piece. You might remember I talked about the configuration interface for macros, where you can configure external databases and so on. Currently we have no support in the software distribution for that. Of course the software distribution supports parameters for packages, but there is no connection to the macro packages yet. So currently the admin still does this by hand, and you can imagine that somewhere in the future it might be desirable to fill that gap, so that the software distribution really brings the macro to the machine pre-configured, with the parameters set up by the administrator.

Okay, and what I have presented is by no means a concluded story at the moment. In some sense we are still at the beginning. The upcoming challenge is the transition to LibreOffice 4.0, which we plan and hope to accomplish somewhere in the next year. Let's see how this will work out in practice. You might remember I talked about the 100 macros which were the result of the initial migration. We don't know yet how big the demand is for further macros to be added. We somehow want to avoid this, but maybe we will again end up in a situation where it cannot be avoided for the sake of the big picture. And as I already mentioned, concerning technology there is still some room for exploration; maybe a student project can shed some light on this in the future.

Okay, just to come to a short conclusion, I can only emphasize this: it is not enough to provide an office platform. You have to provide an office service. You have to think about how you can really match the business environment of your customer.
These are just the links to the tools: the tool for the source storage and the utility which does the other things. If you are in a similar situation, maybe some of this turns out to be useful for you, or you can take some ideas from it. And if you just want to talk to us or to contact us, here is how to reach us. Thanks for your attention.

Question: Do you discourage people from wanting macros? Do you help them to write macros and then consolidate them back into the main system? How does that work?

Answer: So far we didn't have a proper example of that, but of course we don't promote macros as the primary implementation technique. The idea is that you first have to understand what the problem of the customer really is. If he or she shows you some Calc sheets and says, well, I want to relate this column with that column, then you get a rough idea of whether a macro is the right approach or whether another platform fits better. We still have to explore how this will turn out in practice. We are trying, in cooperation with an external service provider, to build up a generic service for lightweight business applications, and how many of them actually turn out to be necessarily office-based is not known yet. But of course we really want to avoid an uncontrolled implementation sprawl. Maybe I can tell you more next year; let's see.