So, hello everybody. Again, I'm Cindy Cicalese. I now work for the Wikimedia Foundation as the product manager for the MediaWiki Platform team, and I'm very excited to be in that position. My talk is called Toward a MediaWiki Roadmap. It's not called A MediaWiki Roadmap, because I'm working toward creating a MediaWiki roadmap, and maybe that will make a little more sense as we go through the talk. I'll start with some historical perspective. As I said before, I worked at the MITRE Corporation for about ten years prior to joining the Wikimedia Foundation, and in my role there I created wikis for our government customers. One thing I was asked several times, as we proposed MediaWiki as a solution to their knowledge management needs, was: this MediaWiki thing, is it going to be there in five years? Is it still going to meet our needs? Is it still going to run on the platform we need in five years? These extensions you're creating for us, to do things like visualize our data, are those still going to be there in five years? The extension question was easy enough to answer, because it prompted us to start open sourcing our extensions, so that even if we at MITRE were no longer involved in helping folks maintain and grow their wikis, the extension code we created would be out there in the ecosystem and other people could maintain it. They could edit it, they could maintain the code, they could hire other contractors to do so. But for the base MediaWiki platform, I kept saying: I need somebody out there, at the Wikimedia Foundation, to tell me that MediaWiki as a product, as a piece of software, will still be there.
And as open source software, of course, we know that it will be, right? Well, I won't say the fork word, but you always know that you have access to the source code if you want it. Still, you'd like to know that this wonderful, diverse community of software developers will continue working on the product, and that they won't suddenly say, hey, this PHP thing was fun, but now I want to run on the grapefruit operating system using the Pimento web server, something completely different. So me, from 2008 to 2016, was saying: we need a roadmap. And I still agree. That's me in the past, and yay, we still need a roadmap. We need to know what the future of MediaWiki is. So I was really excited in 2017 to see that the Wikimedia Foundation was hiring a product manager for the MediaWiki platform and saying things like, this person should be responsible for the roadmap for MediaWiki. Excellent. I wanted to do that, because I think that roadmap is so important that I wanted to be the person who goes and does it. And oddly enough, they decided to hire me, even though I was one of those third-party community people. So I now work at the Wikimedia Foundation, and that roadmap I wanted for all those years is my job. And wow, roadmaps are complicated. So me in 2017 is trying to pull together all those pieces and create a roadmap, and it is very much a work in progress. That's why this talk is Toward a MediaWiki Roadmap. I'll talk a little about the information I'm gathering, why this is a hard process, and what I hope to achieve in terms of a MediaWiki roadmap. And I would love to have your input, from your perspective, on what would be useful information to have in a roadmap and what form that roadmap could take to be useful to you.
So what are some of the things that make creating a roadmap for MediaWiki in particular difficult? I'll talk a little about what we're doing in the future and what we've got in progress. The MediaWiki architecture, in some ways, you could say: hey, LAMP stack, pretty straightforward. But consider all of the components and pieces. If you were to create an extension, how would it reach into the core code and change any given thing? Well, there's this great hook architecture, and it is documented, but it can be difficult for folks to visualize the entire thing. It's quite a powerful architecture that's not completely documented, and it's a big moving target: it's continuing to change, in great ways, but that makes creating a roadmap more of a challenge. Then, what are the boundaries of MediaWiki? When we talk about the MediaWiki platform, you as third-party users probably see a different boundary than the folks using MediaWiki to support Wikipedia at the Wikimedia Foundation. What actually is part of MediaWiki? Are the extensions part of core MediaWiki? If so, which ones? What about the configuration, how to set up a wiki farm? That was a big one for me, because I came up with a whole architecture of scripts, files, and directories, and I hated symbolic links, so I did it without any symbolic links. Some people do it with lots of symbolic links. Is that structure part of MediaWiki? Should it be? I think there should be a better-defined way of specifying wiki farms, because the question keeps coming up. Didn't somebody post a question just yesterday about wiki farms on one of the mailing lists? What is a wiki farm?
How do you define it? There is no reason there shouldn't be a standard way to set up wiki farms. But is MediaWiki a wiki, or is it a farm of wikis? Does it matter? What part of the configuration is really considered MediaWiki itself? What parts are core? What are optional components to support? And should there be several predefined profiles, maybe? The Wikimedia Foundation, for Wikipedia and its sister wikis, has a very well-defined way of maintaining its configuration that works great for wikis of that scale, but typically it's not that useful in an enterprise setting. Should there be another profile that people can replicate for an enterprise setting? What about hobbyists? I heard somebody say earlier that they'd like to set up a wiki to manage their own stuff. Is that another profile, a single standalone wiki that somebody uses as their second brain to hold their memories and thoughts? Then, who's responsible for which pieces of MediaWiki? That's a common question, and one the Foundation is working on. If you find a bug in something, sometimes it's obvious who's responsible for that bit of code, but quite often it's not. Some of the code is maintained by the Foundation and some is not, so it may not even be the Wikimedia Foundation that's responsible for a particular piece of code. And certainly if you're using an extension: is the extension maintained? Is somebody going to respond if you ask a question? All of these things are complicating factors. And then, finally, who decides the future? If I were to put a stake in the sand and say, okay, I'm going to create a roadmap for MediaWiki by myself: well, there are a lot of stakeholders in this.
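To make the wiki farm question concrete, here is a minimal sketch of the symlink-free dispatch pattern: one shared code base, with the requested hostname mapped to a per-wiki settings directory. This is written in Python purely for illustration; a real MediaWiki farm would express the same logic in its PHP configuration (typically LocalSettings.php), and all hostnames and paths below are invented.

```python
# Hypothetical mapping of incoming hostname -> per-wiki settings directory.
# One shared MediaWiki code base serves every wiki; only the settings differ.
WIKI_FARM = {
    "projects.example.org": "/srv/farm/settings/projects",
    "docs.example.org": "/srv/farm/settings/docs",
}

def settings_dir_for(hostname):
    """Return the settings directory for the wiki a request is addressed to.

    Raises ValueError for hostnames with no configured wiki, so a farm can
    fail loudly instead of silently serving the wrong wiki.
    """
    try:
        return WIKI_FARM[hostname]
    except KeyError:
        raise ValueError(f"no wiki configured for host {hostname!r}")
```

The point of the sketch is only that the dispatch is data, not filesystem tricks: adding a wiki to the farm means adding one mapping entry and one settings directory, with no symbolic links involved.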
The Wikimedia Foundation certainly has lots of stakeholders of lots of different shapes and colors. There are a lot of people who care about the future of MediaWiki, so every time I try to define the roadmap, I want to make sure that I'm not excluding some viewpoint. At some point, one needs to put a stake in the sand and say, this is the roadmap, but in a way where these different perspectives are reflected and supported. So I've talked a little about the present; I'll talk a little more about where we are right now, and then I'm going to reach into the past to explain how we got there. Okay, it's not the normal order of things, but let's talk about where we are now first. Some current work that's going on in the MediaWiki platform: how many people here have heard of multi-content revisions? Maybe a quarter of the people here, and some of you know it very well. I'd say the most active project right now, as far as changing the core MediaWiki code, is something called multi-content revisions. Currently, each page has wikitext as its content. Well, there can be different content handlers for different types of pages and different namespaces, but typically, in the main namespace, your content pages will have wikitext as their format. Multi-content revisions aims to allow each page to have multiple slots for a given revision, and different kinds of content can be stored in those different slots. So you may have a main slot that is wikitext, but then have one or more additional slots holding data, perhaps stored in a JSON format: additional structured data associated with that page.
The primary driver behind the current implementation of multi-content revisions is a project called Structured Data on Commons. Wikimedia Commons is a wiki that hosts image and video files that are all part of the commons, and there is data associated with them, for example the EXIF data associated with images, that is not easily queryable in its current format on Commons. The idea is to take that data and, using multi-content revisions, make it available in slots associated with each page. Multi-content revisions in its current form is first addressing the storage layer: how to store that additional structured data associated with pages in the wiki. Future work will then address how you would edit these different slots associated with a single page. So what's going on right now is that first step, and it involves changes to the core MediaWiki code and to the schema for storing revisions within a wiki. The schema change to support multi-content revisions will be included in MediaWiki 1.31, which will be released this summer. Again, that's the first phase of multi-content revisions, and I mention it in part so you can think about: now that pages can have multiple slots, what might you want to do with this capability? I can think of a lot of things, and I'd be happy to talk to y'all later about that, but do think about how this additional capability in core may be useful if you want to do other things with structured data associated with a page in the wiki.
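As a sketch of what the slot model looks like from a client's point of view: in MediaWiki releases after the initial schema work, the action API exposes slots through the rvslots parameter of prop=revisions. The helper below builds such a request and pulls per-slot content out of a formatversion=2 JSON response. Treat it as an illustrative sketch, not a definitive client; the sample response, its "metadata" slot, and its content are hand-made for this example.

```python
def slot_query_params(title):
    """Action API parameters asking for every content slot of a page's
    latest revision (the wikitext "main" slot plus any additional slots)."""
    return {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvslots": "*",                    # request all slots, not just "main"
        "rvprop": "content|contentmodel",
        "format": "json",
        "formatversion": "2",
    }

def slots_by_role(response):
    """Map slot role name -> content for the first page in a response."""
    revision = response["query"]["pages"][0]["revisions"][0]
    return {role: slot["content"] for role, slot in revision["slots"].items()}

# Hand-made response illustrating a page with a wikitext main slot and a
# hypothetical JSON data slot alongside it (the Structured Data idea).
sample = {
    "query": {"pages": [{"title": "Example", "revisions": [{"slots": {
        "main": {"contentmodel": "wikitext", "content": "Hello, world."},
        "metadata": {"contentmodel": "json",
                     "content": "{\"author\": \"Cindy\"}"},
    }}]}]}
}
```

Calling slots_by_role(sample) yields a dict keyed by slot role, so a tool can treat the wikitext and the structured data as siblings on the same revision, which is exactly the capability the project adds.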
Okay, so that was the present, real briefly; now I'll talk a little about the past and how we got here. I have to say that Corey Floyd, who is an engineering manager in Audiences at the Wikimedia Foundation, created the next two slides on the evolution of MediaWiki core, so I thank him and credit him with these very helpful slides, which lead to a current program called the Platform Evolution program that we hope to work on in the coming year. So, before mobile, 2001 to 2011: this should look pretty familiar to y'all as a high-level architecture for MediaWiki. MediaWiki has the core code; it has a parser that knows how to parse wikitext; it has skins, templates, extensions, gadgets. Really a basic, very easy to understand and comprehend architecture. Then people started wanting to view wikis on a mobile platform in a way that took advantage of the form factor, since a mobile device has different characteristics than a desktop browser, and so an extension called MobileFrontend was created. It was another entry point into the MediaWiki back end that would display nicely on a mobile platform, without custom native code for the different types of devices. Then there was this great thing: everybody wanted VisualEditor. They wanted to edit a wiki page without this transition back and forth between the back-end wikitext format and the nice front end; they wanted WYSIWYG editing. In order to support the VisualEditor extension, a service called Parsoid was created. It runs on top of a back end called RESTBase, which allowed partially parsed information to be saved, enabling very quick context switching back and forth between the edit view and the reading view of a page.
So this architecture started to evolve, and you can see a bit more complexity coming in: now we have this parser down here, which is the wikitext parser, and this Parsoid service as well. How many folks here have installed VisualEditor? Cool, lots of you, that's awesome. How many of you have installed it on top of RESTBase and gotten RESTBase up and working as well? Really? Oh, come on, put those hands up high. Well, people tried, maybe, sort of. Yeah, Gerard doesn't count, he works for the Foundation. Okay, anybody who doesn't work for the Foundation who has actually gotten RESTBase installed and working correctly? Just about everybody put their hands up for VisualEditor, that's interesting, and nobody for RESTBase. It's sort of hard to install; I tried. "Oh, I should have put my hand up for VisualEditor and not for RESTBase." Yeah, a lot of us tried. It's a little bit complicated, but it works super well. It is definitely what was needed for Wikipedia to be able to edit and context switch quickly. The folks who wrote it know how to install it, and it is amazing, but it's a little bit difficult for those of us in the third-party community (I guess I was wearing my 2016 hat then) to install. So it's great, but it adds complexity to the architecture. And then there's this mobile content service that was added to interact with MobileFrontend, and you've got your iOS and Android apps. So as you start adding more capabilities, more features to the platform, the architecture sort of evolved this way. MediaWiki is great, it does amazing things, and Wikipedia is awesome, but the architecture kept growing an extra arm over here, an extra hand over there, and it's gotten a little unwieldy. "Like an octopus." Exactly like an octopus. And then there's all this other stuff.
It's definitely an octopus: really super smart and awesome, with lots of appendages, but a little bit hard to manage. So again, thanks to Corey, who coined the term "just-in-time architecture." The architecture of the current MediaWiki was not so much designed as emerged. It's fabulous, but maybe it's time for us to sit back, revisit it, and clean off some of those rough edges. It has some fragmentation. There's a maintenance burden that has been incurred by the evolution. There's developer impedance: it's difficult for developers to figure out how to interact with the code and wrap their minds around the architecture as it exists now. A lot of technical debt has accumulated. And the documentation just isn't that good right now. There are parts of the MediaWiki documentation that are amazing and very well maintained. There are also cases where the same thing is documented in multiple places and in multiple ways, some of which are accurate and some of which are not so much. So there needs to be some focus on the documentation. Sure. Where's the mic? James, you have one job. So the question is, do you have a roadmap for documentation? Yeah, I'll talk about that, absolutely. So, digging ourselves out of a hole. Again, thank you to Corey. Back late last year, and becoming concrete in December, was this thing called the Audiences-Technology working group. That's an interesting name: it makes a whole lot of sense to folks in the Foundation, maybe not so much to people outside. At the Wikimedia Foundation, two of the primary places where developers sit are a department called Audiences, which used to be called Product, I think, or something with product in the name, and which is really product-facing, and a department called Technology, which is more infrastructure and services.
And that's where the MediaWiki Platform team lives. So there were a lot of folks who were user-facing, concerned with features and things like the mobile apps, who lived in Audiences and wanted a simplification of this architecture. And there were folks in Technology as well, who maintain a lot of this infrastructure and core code. Both sides had opinions about the need for a simplification of the architecture, and perhaps different opinions about the way the evolution should go. So the Audiences department and the Technology department formed a combined working group. I thanked Corey Floyd a little earlier for those slides; he stepped forward to actually lead this working group. He's in Audiences; I'm in Technology, and I was working with him as part of the steering group for this working group. Starting in December, there were 18 folks from Audiences, from Technology, and from Wikimedia Deutschland. Daniel Kinzler has been very actively involved; he's also the lead of TechCom, the committee that has oversight of large architectural changes to the MediaWiki platform. So Daniel Kinzler is very involved from Wikimedia Deutschland. We had folks contribute what they thought the main issues were with the current MediaWiki platform and ecosystem, and we collected 83 issues. We didn't initially deduplicate them or try to ensure complete coverage of all the possible issues that exist, but people identified what they thought were the most important things to them. Then we went through a bit of a decomposition, categorizing the issues along 28 dimensions, 28 different types of things, so that we as a working group could wrap our minds around what we thought were the most important things to address as we evolve the platform.
As we were going through this process, we were all getting ready, as a Foundation, for our annual planning process. With the support of the CTO and head of the Technology department, Victoria Coleman, and the head of the Audiences department, Toby Negrin, together we decided to create a cross-departmental program for the annual plan that also includes Community Engagement, a third department that has input into the interactions with the community. So we came up with this thing called the Platform Evolution cross-departmental program, or CDP, that will take the platform forward. That's how we got to where we are. Victoria will talk on Friday morning a little about the Wikimedia strategy process. Over the last year, we have gone through a process of trying to come up with a strategy for the Wikimedia movement, to say where we will be in the year 2030. The movement came up with a few key goals, supporting knowledge as a service and supporting knowledge equity, and those are guiding principles toward the year 2030 for the Wikimedia movement. Those principles also helped guide what we want to support in a software platform for the movement. In addition, going into this, there were position papers written by the Audiences department and the Technology department saying what they thought the issues were and how we should go forward. And every year the Wikimedia Foundation has a Developer Summit; we had one in San Francisco in January, and the working meetings at the Dev Summit also helped to inform the Audiences-Technology working group, which created the Platform Evolution cross-departmental program. That program, going forward, will work on the evolution of the MediaWiki platform with the advice of TechCom.
To give you a brief view: whenever I try to characterize or solve a problem, I always think, what's the best tool for this? What's the best tool to store all the knowledge and pieces of information I'm accumulating in the course of trying to do my job? And of course the answer is always a wiki, right? So I created a wiki. In fact, I created a wiki farm, wikifarm.wmflabs.org, and maybe over the next couple of days I'll show you some of the other wikis I've put on there. I use it for project management for the projects I'm currently tracking within the Foundation. I have another wiki farm, cindy.wmflabs.org, that's my little playground, and I created a wiki on there called Notes. That's my little engineering notebook for things I just want to jot down, because I can't remember anything anymore unless it's in a wiki. So I create these little wikis for the different aspects of the things I'm working on, and I created a Platform Evolution wiki. Here it is. These diagrams are generated using the Mermaid extension, invoked from Lua, and inside the Lua I do some semantic queries. This diagram got generated when I brought the page up: it queried the semantic data in the wiki and built that, which I think is pretty cool, and I can click on these things. So, the main goal: we did a three- to five-year plan for this cross-departmental program, because the things we're talking about doing are not going to happen quickly, right? It's not going to be tomorrow that we have this done. So we've got a one-year plan that we proposed in the annual plan, but we also started looking from the higher level, with a three- to five-year plan.
Our overall goal is to empower the Wikimedia Foundation to accomplish its goals of knowledge equity and knowledge as a service (remember, those are two of the key goals of the Wikimedia 2030 strategy process) by evolving and investing in our technology stack to improve its flexibility, maintainability, and sustainability. This all goes toward having a roadmap; this is a high-level, 30,000-foot view of what our roadmap is. There were three main thrusts in the three- to five-year goal. Outcome one: allow engineers to more easily scale, maintain, and test Wikimedia projects. That's basically all about architecture: first figuring out what the architecture is now, documenting that, and then planning how to evolve it toward the architecture we want to have. Simplifying things. In that last picture of where we are now, with Parsoid over here, we've got two parsers. So: simplifying that, and coming up with the architecture of the future. Outcome two: users across all languages (that's one of the great powers of MediaWiki, its ability to support multiple languages) and across accessibility and usability spectrums can access the functionality of Wikimedia projects across more interfaces, devices, and form factors. Into the future, of course, that means handheld mobile devices. Sure, people will still use laptop and desktop form factors, but we want to see how much of the functionality of MediaWiki, the ability to edit on the fly on a mobile device, for example, can be accessed across a range of different types of devices. And outcome three: engineers are more effectively onboarded using new, improved documentation that is clear, complete, and cohesive. Yes. We are talking about stepping back and looking at what we can do to have a comprehensive documentation portal, because there are multiple aspects of MediaWiki that need to be documented.
Certainly the front-end aspect; how to set it up; the server, back end, and deployment of the MediaWiki platform; the APIs, meaning the API interface somebody would use to query from a remote platform; the architecture, looked at from a high level; coding standards; and, if somebody wanted to develop an extension, what the entry points and interfaces for that are. So there are a lot of different aspects, and then there are tools. For Wikipedia, a lot of people have written tools that can mine information from Wikipedia and its sister projects, but folks have also written back-end tools to query data from their third-party wikis, and many of those tools can be useful to different people. So, just as on mediawiki.org you can see all of the extensions to MediaWiki, there's also work on a comprehensive tool catalog for tools that can interface with MediaWiki. Did I cover all of the documentation things you were thinking are missing? Do you have the microphone still? I think it's turned off. No? Yeah, that's good. "I was looking at the separation of the documentation by intended audience." Yes. "One of the things we need to do is look at the above-the-platform documentation for wikitext and orient it for the people who are going to build the application. Then you've got platform engineering documentation, which is a totally different area." Absolutely. "And I'm hoping we'll get some sort of ontology for documentation that could start the structuring process for it. I'd be happy to collaborate with anybody, because that is a very serious issue: to get that documentation aligned so we know what parser functions are, what we know." Yes, absolutely.
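As one small example of the kind of remote querying those API docs and tool catalogs support: any MediaWiki wiki, third-party or Wikimedia, exposes the same action API at its api.php endpoint, so a tool can query it without scraping HTML. The sketch below builds a full-text search request and extracts matching titles from a response. The endpoint URL is a placeholder and the response dict is hand-made for illustration; the parameter names follow the documented list=search module.

```python
from urllib.parse import urlencode

def search_url(api_endpoint, term, limit=5):
    """URL for a full-text search against a wiki's action API (list=search)."""
    params = {
        "action": "query",
        "list": "search",
        "srsearch": term,
        "srlimit": limit,
        "format": "json",
        "formatversion": "2",
    }
    return api_endpoint + "?" + urlencode(params)

def titles_from_search(response):
    """Page titles from a list=search action API response."""
    return [hit["title"] for hit in response["query"]["search"]]
```

A tool would fetch search_url("https://wiki.example.org/w/api.php", "roadmap") and feed the decoded JSON to titles_from_search; the same two-step shape (build parameters, pick fields out of the response) covers most read-only uses of the action API.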
And we want to make somebody who comes to this documentation portal able, with as few clicks as possible, to get to the segment of the documentation most relevant to what they're trying to find. And to make sure, as I said, since right now there's a lot of duplication, that somebody can very quickly find the one authoritative source that answers their question. So. Mm-hmm. Back to you. So the question was to explain the terms knowledge equity and knowledge as a service, and that may be discussed more by Victoria when she talks about the strategy on Friday. But really briefly: knowledge as a service would be the ability to take the combined knowledge in a resource like, for example, Wikipedia, and to use and query it in multiple different ways. That repository of information would not only be an encyclopedia that one would walk up to, browse, and search within, but something you could access as a service in multiple ways, to create even greater and more powerful uses of that open knowledge. Knowledge equity would be making sure there are no barriers for different people of different abilities to access the data, whether those are network constraints, language constraints, or geographic or political constraints, so that there is equity in access to the knowledge contained in these repositories. Okay, so that's the three- to five-year plan. The plan we came up with for next year is very similar. Outcome one: engineers have a clear understanding of the technology stack. Again, that's the architecture, being able to define where we are and how we plan to move forward. And outcome two: engineers are able to access more functionality using encapsulated components and well-defined APIs.
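The "knowledge as a service" idea just described, and the well-defined APIs in outcome two, are already visible in Wikimedia's service-style endpoints. For example, the public REST API exposes a page-summary route that any application can consume without parsing an article. The tiny helper below builds such a URL; the route shown follows the public api/rest_v1 layout as I understand it, but check the live documentation before relying on it.

```python
from urllib.parse import quote

def summary_url(title, host="en.wikipedia.org"):
    """URL of the Wikimedia REST page-summary endpoint for an article title.

    The title is percent-encoded so spaces and slashes survive in the path.
    """
    return f"https://{host}/api/rest_v1/page/summary/{quote(title, safe='')}"
```

Fetching that URL returns a small JSON document (title, extract, thumbnail) rather than a full page, which is exactly the "query the knowledge in multiple ways" idea: the encyclopedia consumed as a service instead of as rendered HTML.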
So: revisiting the APIs that are available and making sure they are available in the ways they need to be consumed. And the third: understanding the current architecture and coding standards and where to find them. That's the documentation aspect. Just briefly, so you can see I'm not making this up: if you click on any of those boxes, it takes you to the drill-down. This is the drill-down on outcome one, and there are outputs associated with it, and each of those is itself a page in the wiki. And that top-level view, as I said, is all queried. I feel really strongly about the use of wikis and their power and utility: write something once, then query it in many ways and display it in many ways. I'm trying to evangelize that at every opportunity. So, how do I get back to my presentation? Where did my presentation go? I talked about the three- to five-year outcomes, and just to simplify and summarize them: easy to scale, develop, maintain, and test; all features across all platforms, all devices, all form factors; and all of it documented. Those are the high-level three- to five-year goals. In our first year, the goals are a clear understanding of our stack and the plan to evolve it in the future, and starting to modularize selectively across the current code base. Obviously we're not going to throw out things as they exist now and say, we're just going to give you a completely different set of code that you'll suddenly have to get everything working with in a big bang. No, things are going to evolve in place. Selective modularization: that's not to say the interfaces won't change, and folks will have to adapt as this grows, but the idea is to give a well-defined path and good warning in advance of how things will change. And then a centralized documentation portal. "Quick question for you: regression testing. What are you looking at?"
"Because we have this issue where something comes out: are all the present features supported, or is something broken? Or are we in an emergency situation where we need to notify somebody?" Yeah. Well, I guess there are several different ways to answer that. Just have them hold on to the microphone. Testing: there is testing built into the MediaWiki code, and increasingly more testing with time. The goal is, as changes are introduced, to make sure things don't break, or if they break, they break in a well-defined way and folks are aware of that. But also there is now, and will continue to be, a deprecation policy, where changes are announced in advance. If something is going to go away (he can hardly hold himself together; give me one second and then I'll let you talk), yes, there's a deprecation policy, so you are aware in advance, across multiple versions: when you upgrade, things will first become deprecated for a while before they eventually go away. "What is the communication policy for these deprecations? I appreciate that you recognize the need for no sudden changes. But at the same time, if you deprecate something and that's not communicated clearly to an end user, deprecation can look just like yanking the rug out. So, how are you going to communicate the deprecation to the users?" It's going to have to be communicated through multiple channels, and certainly, for the third-party community, if there are ways that would help us communicate that information better so that more people have access to it, we are always open to that input and feedback. But everybody knows there's the MediaWiki mailing list, and if there's a major architectural change, it's going to go through an RFC process with TechCom and be announced.
So, there are already existing channels in place. If there are things that you feel... well, there's an answer he wanted to give. Yes, absolutely. We haven't mentioned the MediaWiki Stakeholders' Group enough today, have we? The MediaWiki Stakeholders' Group is a joint group that was created from a number of members of the MediaWiki third-party community as well as folks who are at the foundation, and since some of us original members of the stakeholders group have transitioned from the third-party community to the foundation, there are increasingly channels of communication from the stakeholders group to the foundation. Yes, the MediaWiki Stakeholders' Group meets online monthly. There is a page on mediawiki.org. There is a Phabricator project. There's a website, mwstake.org. Chris Corner, are you going to cover it? Oh, Chris Corner is going to talk about it in the very next presentation, so I can stop talking about it, because I'm probably running out of time, right? Yes, it's a great group and it is a great channel of communication. If you are not currently tied into the MediaWiki Stakeholders' Group, please, we welcome your participation, and I'd love to see even more people every month at our online meetings. Mark Hirschberger is our fearless leader, and sometimes we forget to announce things quite well enough. Please forgive us. But yes, it is a good channel of communication between the third-party community and the foundation.

So, speaking of third-party users in the stakeholders group, can you speak a little bit about how your work with this roadmap is tied to, and restricted by, the Wikipedia approach to MediaWiki use, versus how other people like to implement it in ways that don't necessarily parallel that use case?
So, one of the things that was mentioned multiple times by multiple people in the technology working group through the course of this work is the need to consider the third-party use case, and to make sure that in evolving the platform we do not do anything that negates the ability of third-party users to continue to use MediaWiki. So, it is very much on the minds of those who are evolving the platform. But there are questions, there are constraints in the evolution of the platform. I'll give you an example: managed hosting, the ability of somebody to go out to a host who will install a version of MediaWiki for you, and maybe some extensions, but you don't have command-line access; they're managing it for you. There are limitations to the things that you can do without command-line access currently. How many folks here actually operate... I'm talking too long, aren't I? A lot. I'm behind time already? Really? I thought I had until... I'm way over? Okay, I'll wrap it up then; I'm just eating into Chris's time. Okay. Did I just blow right past 30 minutes and into close to 60? Wow, because I looked at the clock a few minutes ago and thought, hey, I've only been talking for 20 minutes. It feels like way longer than that. Okay, I'm sorry. Maybe I can go into this a little bit more in a lightning talk. But I was just going to get people to raise their hands: how many folks here work in that managed hosting environment, where some other organization is responsible for their MediaWiki installation and they have no access? So you... okay, one. Okay, I'd be interested to... pardon me? Okay. So yeah, I'd be interested to talk to both of you more later. But that's one use case that very often comes up, as in, well, we can't do anything that'll break managed hosting. And very seldom do I come across somebody who is actually in that use case.
So, I want to get more feedback on that and figure out what the real constraints are there, to make sure that we aren't unnecessarily constraining ourselves but continue to support that use case as we can. Maybe there are alternative solutions that we can come up with that would support you just as well. Or maybe we need to keep that as a constraint. So yeah, at any rate, I'd like feedback from y'all on what really are and are not constraints on our system. And there are some other programs as well: so the big CDP, the big Platform Evolution thing, is the big thing that we're planning for the next three to five years, starting next year. There are other programs as well that we as the MediaWiki Platform team have an involvement in next year, and those are some of them. I can talk about them later if anybody has questions. I think that's it. I'm done. Chris? Sorry, Chris. Cindy?

So, how long do we have before we have to leave for dinner, or before we were planning to break? Okay. So, I think Peter has a question. I have the microphone. Yeah, he has the microphone, so I'm sorry, he trumps you.

So, if you had to answer Cindy in 2016, asking, how can I guarantee my customers that this wonderful platform will still be alive in a couple of years, what would your answer be? And what would you do to convince her?

So, I'm having a conversation with myself two years ago, is that what I'm doing? Okay. I would convince her that the Wikimedia Foundation, as a steward of the MediaWiki code base, in collaboration with the volunteer developers, cares very much about third-party use cases. Obviously, Wikipedia is customer number one, and we want to make sure that Wikipedia continues to be supported.
But there are those in the foundation who care very deeply about MediaWiki as it is used by third-party developers, and who will continue to make sure that third-party users are supported and that nothing will happen within the platform that would make it impossible for them to continue to use it. So, I would try to convince myself that I can sleep well at night knowing that there are those who will continue to make their interests heard.

Mark? As the program chair, I just want to say that we can go eat whenever we want, and if that's later, then we can eat later. As long as no one's stomach starts growling too loudly; if I hear a stomach growling, then I'll cut Chris off. Cool. And I think that's a good... I think you should go today. Yeah, I think it's a good segue into your talk. So, cool. Thank you all.