Okay, we're done. So let's welcome Bernat Gabor. Bernat, can you unmute? Yes. Am I pronouncing your name correctly, by the way? Yeah, it's Bernat Gabor. Okay. Right, so Bernat works for Bloomberg and he's going to talk about virtualenv: rewriting and re-releasing it. So please start your screen sharing and then we can move ahead. Thank you.

Hey, hello everyone. Hopefully you can see my screen share at this point. Yes, it's working fine. Okay, so let's start. Go ahead.

So hello, I am Bernat Gabor, and I'm going to be talking to you today about rewriting and releasing virtualenv. This will basically be about what I learned by trying to do this ambitious project.

The first question we need to answer is: what is virtualenv? For those of you who might not know it yet, virtualenv is a tool that allows you to create Python virtual environments. And what is a Python virtual environment? It's basically a Python executable that behaves as if it were a separate Python installation, separate from your system one. What does this mean? It means that packages installed into the virtual environment's Python will not affect the system Python installation. For example, I can create a virtual environment (you can see the CLI command for it on the slide) and install a package into it, and I can do this even if I don't have administrative rights to install packages. The tool will be happily invocable from within the scope of that virtual environment, and everything works just as if it had been installed into the system Python.

The really important thing about a virtual environment is that, other than the isolation it provides, it behaves in almost everything exactly as the system Python would. For example, if I print out information about the Python environment and then create a virtual environment from it, what I get back is the same version information, down to the last byte.

virtualenv is a fairly popular project; it has been around for the last 13 years. Just to quantify how popular it is, I used a pypinfo-powered query to find out how many downloads the project had in the last year or so, and for virtualenv this came to around 0.8 terabytes per 12 months. If I plot the results, you can see it gets around 300,000 downloads on a daily basis, which per week and per month sums up to around 120 million installations per year. And that's only installations from the PyPI index server; it doesn't necessarily include installations from mirrors and whatnot. The important thing is that this makes it, at the moment, around the 66th most popular Python project. So as you can figure, it is very widely used, and doing any kind of change to the project is always somewhat problematic.

Now, one thing this talk is not going to be about is what virtual environments are in depth, meaning how virtual environments actually work. If you want a glimpse into that, and into how Python itself achieves creating isolated virtual environments, I invite you to watch my talk from last year's EuroPython, which I've linked here; there you'll find all the information you might want about that. Today's talk is going to be mainly about the virtualenv project rewrite.
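To make that opening example concrete, here is a minimal sketch of creating an environment and installing a package into it. `cli_run` is virtualenv's public programmatic entry point; the attribute name on the result and the package choice are illustrative assumptions on my part.

```python
import subprocess

from virtualenv import cli_run  # virtualenv's public programmatic entry point

# Create an isolated environment in ./demo-env (same as `virtualenv demo-env`
# on the command line).
session = cli_run(["demo-env"])

# Install a package into it without touching the system Python; no admin
# rights are needed because everything stays under ./demo-env.
# ("requests" is just an illustrative package choice.)
subprocess.run(
    [str(session.creator.exe), "-m", "pip", "install", "requests"],
    check=True,
)
```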
Before we go ahead, I would just like to clear up one thing: when we talk about virtual environment creation tools, it's not just virtualenv that is around. We also have the venv module, which is part of the standard library as of Python 3.3 and is defined by PEP 405. You can use both of these tools to create virtual environments. So, just to clear things up, what is the difference between the two projects?

Well, besides virtualenv being created 10 years before venv, the difference is that virtualenv is a third-party package. This means you will actually need to install it and manage that installation separately from your Python installation, while the venv package comes with your Python installation: you have to do nothing, just install Python and it's already available. Whether this is an advantage depends on where you look at it from. The positive is that, in the case of virtualenv, if you discover a bug in the project, getting the fix can be a very quick turnaround: you can file a bug and a pull request, and maybe a day or two later there can already be a release that contains the fix. With venv, if you find a bug, you may well need to wait months in the best case, and even years in the worst case, just to get that bug fixed, because you need to wait out the entire release cycle, and not just the CPython release cycle but also your operating system's release cycle.

Another big difference is that virtualenv also supports Python 2.7, while venv only supports Python 3.3 and later. So if you need to create virtual environments with Python 2.7, your only bet is virtualenv. On the other hand, virtualenv itself supports only CPython and PyPy, so it will not be able to create virtual environments for any other interpreter flavor. That is not the case with venv: given an interpreter implementation that satisfies the 3.3-or-later requirement, it should already support venv.

Now, the first four bullet points I've listed here were true before the rewrite too. As I just said, in the last year there has been a virtualenv rewrite, and with that rewrite and release virtualenv gained a few additional benefits that are true today but were not true a year ago.

One of them is that virtualenv is much more configurable: you can configure the tool not just through command line arguments, but also through environment variables or user configuration files. You can use these configuration files even to alter the default values for virtualenv, which gives you a lot of power. For example, if a broken pip gets released, you can easily use the configuration file to pin to an older pip version (see the sketch right after this section). The same thing cannot be said about venv.

Another thing is that virtualenv is extensible: it has a plugin system, meaning you can add functionality through it, such as activation scripts for additional shells, or support for additional Python flavors. For example, if you want IronPython virtual environment creation, you can write a plugin for virtualenv that allows you to create IronPython virtual environments, or RustPython, or whatever else there might be.
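Here is that pinning idea as a minimal sketch. virtualenv reads defaults from a `virtualenv.ini` user config file and from `VIRTUALENV_<OPTION>` environment variables; the exact pip version below is purely illustrative.

```python
import os

from virtualenv import cli_run

# Any CLI option can also be supplied as a VIRTUALENV_<OPTION> environment
# variable (or in a virtualenv.ini user config file). Here we pin the pip
# that gets seeded into new environments, e.g. to sidestep a broken release.
os.environ["VIRTUALENV_PIP"] = "20.0.2"  # illustrative version

cli_run(["pinned-env"])
```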
venv is also extensible, by extending its base class. This is a bit tricky, because if you extend the base class you do not necessarily get the same command line interface; you have to define the command line interface all over again.

Now, finally, probably the biggest or most impactful difference for the day-to-day usage of the average user is going to be performance. virtualenv allows you to create a virtual environment in less than half a second, while with venv this usually takes two seconds or more. And that is in the happy case where you're running on a Linux operating system, where this operation is faster; multiply it by a factor of two or so on Windows.

And a final additional benefit of virtualenv is that it has a rich API. If you use it from Python to create a virtual environment, you get back an actual Python object on which you can query information such as: what is the Python executable that was created, what is its location, where is the site-packages folder, what is the Python interpreter version. That kind of information is readily available on the result object, and it can be very important if you're writing a tool that wraps around virtualenv environment creation. I'll show a small sketch of this in a moment.

So, just as a sanity check: now that we have venv as part of the standard library, do we still need virtualenv? My answer is that I do think we need it. The way I see it, virtualenv is basically the place where we can innovate and improve the virtual environment creation ecosystem. We can try things out, check whether they make sense, and in case they do, we can talk about porting them to the venv project so everyone gets them out of the box. Beyond that, the virtualenv project on its own provides additional benefits like better performance, a better API, and better extensibility, and those are bullet points that matter a lot if you're writing tools around Python environment management. For those projects, virtualenv will very much be the go-to project.

Okay, so now let's focus on the rewrite itself. The first thing we need to cover is why we decided to rewrite. And a small aside: if you're wondering, the dog in the picture is my dog, a small Yorkshire Terrier called Silky. So, let's move on.

One of the biggest reasons we decided we needed a rewrite of the project is a lot of old pains. These pains were that the virtualenv project, as it was a year ago, was basically a single-file project with if/else branches spread across thousands of lines of code. And to make things worse, the test suite only covered around 60% of the lines, and we're not even talking about branch coverage at this point. You can imagine from this that maintaining the project was very cumbersome: basically any change meant you were never really sure it wasn't going to break the world. It was a very iterative process: we fixed something, waited a few weeks, and if no one submitted an issue saying "hey, you broke me", then it was probably a safe change, but we had no certainty around this.

It also had a rudimentary plugin system. In theory it made it possible to extend virtualenv, but the way it did so was a poor man's plugin system: it had marked comment sections within the code.
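Returning for a moment to the rich API mentioned above, here is a minimal sketch of querying the result object. `cli_run` is the real entry point; the attribute names on the session and creator follow my reading of the docs, so treat them as assumptions to verify against the version you use.

```python
from virtualenv import cli_run

session = cli_run(["api-env"])

# Attribute names below follow my reading of the virtualenv 20 docs.
creator = session.creator
print(creator.exe)                       # path of the created Python executable
print(creator.purelib)                   # site-packages folder of the new env
print(creator.interpreter.version_info)  # version of the underlying interpreter
```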
You could basically inject your own code into those comment sections. So while it technically had a plugin system, it was very hard to use, and very few people actually used it. In practice this meant that when other interpreters or people wanted to add functionality, they tended to add it straight into the core project, which made it even harder to maintain, especially given the low confidence we had that things were not going to break once we added new lines to the code base.

And finally, a lot of things were designed 12 years ago or so, meaning that many of the design choices in the code base were based on what Python offered and allowed 12 years ago. Python got a lot better and smarter over the last 12 years, so we could now have much cleaner, much nicer solutions for a lot of the problems in there.

Let me also say that rewriting virtualenv wasn't something we just came up with last year. It was something that had been around at least since 2014 or so, and there have been at least two attempts, neither of which came true. Before jumping into our own rewrite, it was important to understand why those projects failed. Looking at them, I think they mainly failed because virtual environment creation is, in essence, deceptively simple. The important word is "deceptively", because once you actually go into the depths and start doing it, there are a lot of nuances and edge cases that are going to make your life really difficult and bitter. It's also very hard to test, because you basically need to support all possible platforms and environments, so it's hard even to set up a test environment just to make sure all the Python versions work correctly. It's just practically very hard to do.

So, with that said: I jumped into the project myself around 2018, but before that I had been a tox maintainer since 2017. For those of you who might know it: tox is basically a test suite runner, meaning it manages setting up and running your tests. Part of setting up your test suite is creating a virtual environment, installing your project into it, and then running the test suite. A lot of the bug reports we got in tox actually related to virtualenv, because tox uses virtualenv under the hood. Because of this I started to interact more and more with the virtualenv code base, to the point where in 2018 I actually became a maintainer of it. At that point the project was very much in an abandoned state.

In 2019 I got to the point where I wanted to attempt a rewrite of tox itself. The reason is that one of the major concerns people kept reporting about tox was: hey, tox is very slow. So I started to investigate why tox is slow, and what I found is that a lot of the slowness actually comes from the virtualenv project itself: the creation of the virtual environment is slow; the API lacked the ability to provide the information we needed in the test runner, so we had to make additional subprocess calls that take extra time; and the interpreter discovery happens in both tox and virtualenv. So at this point I thought: okay, it seems that to make tox faster I need to make virtualenv faster and better.
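If you want to see the speed difference on your own machine, a rough stdlib-only timing sketch like the one below works. Absolute numbers vary a lot by OS and disk, and virtualenv's very first run also primes its wheel cache, so run it twice for a fair comparison.

```python
import subprocess
import sys
import tempfile
import time
from pathlib import Path


def time_creation(module):
    """Time one `python -m <module> <dest>` environment creation."""
    with tempfile.TemporaryDirectory() as tmp:
        dest = Path(tmp) / "env"
        start = time.monotonic()
        subprocess.run(
            [sys.executable, "-m", module, str(dest)],
            check=True,
            capture_output=True,
        )
        return time.monotonic() - start


print(f"venv:       {time_creation('venv'):.2f}s")
print(f"virtualenv: {time_creation('virtualenv'):.2f}s")
```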
So I paused my rewrite of tox and started doing the virtualenv rewrite. Given that at least two people had already failed at this project, I set up a plan. The plan started with identifying the components within the virtualenv project: which sub-segments can be separated out of the code base? I came to the conclusion that there are essentially four main sections within the virtualenv project.

The first section is the creator. A creator is something that, given a system Python, provides you a virtual environment of that Python. Now, this Python is very raw: it doesn't have any packages or anything like that; it's basically just a Python environment that's isolated from the system Python.

The next thing is the seed mechanism. Having a virtual environment without being able to install packages into it is often not that useful, so it is very important that once we create a virtual environment, we seed it with the pip, setuptools, and wheel packages, so that users can actually start using that virtual environment and install additional packages into it. This process is what we call the seed mechanism: we seed some packages into the virtual environment so that users can then install additional packages.

The third component is activators. A virtual environment is already usable if you just invoke its Python executable directly: if you type bin/python, it's going to work magically. But users tend to use it more from the CLI, and in the CLI it's easier to just type python rather than the full path to your virtual environment. For this reason we have activator scripts. The activator scripts are small scripts, specific to the shell you're using, which alter your shell so that when you type python, instead of invoking the system Python it invokes the virtual environment's Python. And it's even more generic than that: when you type pip, for example, it will also invoke the pip from within the virtual environment rather than the system one.

And finally, interpreter discovery. Interpreter discovery is important mainly for supporting cross-Python environment creation. As I said before, in order to use virtualenv you have to install it. Now, if your system has two Pythons, for example Python 2 and Python 3, and you want to create virtual environments with both, it's helpful to only need to install virtualenv into one of them. For example, you can install virtualenv just into Python 3 and still be able to create a Python 2 virtual environment, even though you don't actually have virtualenv installed into that system Python. And it goes even further: it allows you to be very broad about which Python you require. For example, you can specify, as you see here, "I want Python 3.6.9, 64-bit", and virtualenv will look around your machine, find whether there's any interpreter available that satisfies that requirement, and if there is, it will be able to create a virtual environment from it.
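As a sketch of what that discovery looks like programmatically, you hand over a specifier string instead of a path. The exact spec grammar is documented by virtualenv; the one below is my best reading of it, so adjust it to an interpreter you actually have installed.

```python
from virtualenv import cli_run

# Ask for "a 64-bit CPython 3.6.9" without knowing where it lives; the
# built-in discovery searches PATH (and the registry on Windows).
session = cli_run(["--python", "cpython3.6.9-64", "py369-env"])
print(session.creator.exe)  # which interpreter discovery actually picked
```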
So this is again just a quality-of-life improvement: you don't have to memorize where your Pythons are installed on your machine, and you can still create environments just by typing in a short Python specifier string.

The first thing when starting a rewrite of a project is to actually decide what you're going to continue to support. In the case of virtualenv, we decided to continue to support the kind of cross-version creation I just described on the previous slide. We also wanted to keep supporting Python 2, for one because we want to support end-of-life Pythons for a few years past their actual end of life, just to ease the transition for people; virtualenv is going to be one of the last projects to drop them. Similarly, we also support PyPy, and until PyPy drops Python 2, we still need to support Python 2. We also wanted to support no-install use, and this is something I'll get into a bit later: you should not necessarily need to install virtualenv to be able to use it; you should be able to just download and run a pre-built virtualenv zipapp.

Then the next question is what we wanted to add. What we wanted to add as new functionality was extensibility, that is, the plugin system, so that additional Python implementations can provide their virtualenv integration outside of the core virtualenv project (a rough sketch of a plugin follows at the end of this section). We also wanted to provide the rich API, so that tools that want to wrap around virtualenv can get information about the created virtual environment very easily. And we wanted to unify the virtual environment creation mechanism. This is something a lot of you might not know, but before the rewrite, the old virtualenv created virtual environments one way, while venv used a different way. The reason for the discrepancy is that virtualenv was created back in the days before venv, so it had to more or less monkeypatch Python to be able to support virtual environments; with venv, the CPython interpreter itself added a lot of mechanisms that make it much easier to create a virtual environment.

We then also decided what we're not going to support, and whenever you decide not to support something, it's very important to provide alternative paths for the users. For example, we decided that we're not going to support relocatable virtual environments, and the reason is that getting relocatable virtual environments correct is very hard if you're targeting all platforms. It's something we tried: we supported it in experimental status within the virtualenv project for a few years, but we always got a lot of bug reports. So with the rewrite we decided to tell people: hey, try to provide this functionality as a plugin. As a plugin you can scope it very narrowly, so that it only supports your platform, and there you can ensure that things are actually going to work.

Another thing we decided to drop is all the deprecated flags, because virtualenv had a lot of flags that were introduced ten or so years ago, and we wanted to drop them.
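Here is that plugin idea as a rough, hypothetical sketch: a new activator registered through a setuptools entry point in the `virtualenv.activate` group. The base-class import path and the `generate` hook follow my reading of the plugin docs, so double-check both before building on this.

```python
# my_plugin.py -- a hypothetical activator plugin for an imaginary shell.
from virtualenv.activation.activator import Activator  # path per the docs


class EchoActivator(Activator):
    """Writes a trivial activation script into the environment's bin dir."""

    def generate(self, creator):
        script = creator.bin_dir / "activate.echo"
        script.write_text(f'export VIRTUAL_ENV="{creator.dest}"\n')
        return [script]


# The plugin is then registered in the package metadata, e.g. in setup.cfg:
#   [options.entry_points]
#   virtualenv.activate =
#       echo = my_plugin:EchoActivator
```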
And we actually did that, and especially after the rewrite, people started creating issues on our issue tracker: hey, why did you drop this flag, it just disappeared. We had to point out that this was something that had been deprecated for years; please just remove the flag, and everything should work just as it did before.

We also wanted to stop having the entire project in one file, and the reason for this was to actually get to a state where virtualenv is easier to maintain, reason about, and test. And we also wanted to stop being dependency-free: we actually wanted to be able to pull in dependencies from the Python ecosystem, so we don't have to rewrite and maintain a lot of functionality within virtualenv that's already available as a PyPI package.

So, once we had a rough plan of what we want to support, what we don't want to support, and how we're going to go ahead, we posted it as a proposal on our GitHub and let people comment on it. It was in general positively received, with a few additional changes suggested, to the point where I thought: hey, we're ready to do this. And as you can see here, I was very optimistic; it was "give me a month and I'll be ready with this". Fast forward six months: six months later I got to the point where I was ready to make the first review request. This is when I pinged some Python Packaging Authority maintainers: hey, look at the code base, let me know if this makes sense and kind of works. I got back some reviews, made a few iterations on them, and on January 21 we got the first beta out. The first beta allowed users to actually test it on their machines without needing to install it from a git repository. That got a few more feature requests and bug reports; we applied those and released the second beta to the users, so they could test again. After the second beta we did not get much more feedback, partly because the people interested enough to test the betas had their use cases covered by then, so on February 10 we went ahead with the first public release.

Just to speak a bit about the people who made this possible: for the first release, Kumar helped migrate the activation scripts from the old virtualenv to the new one, but otherwise everything else was done by me. Before the project went public, people started looking at the code base and made a lot of suggestions, improvements, and pull requests; this is the list of people who contributed to the rewrite, with their number of changed lines.

And before we actually did the release, it was very important that we publicize that this release was going to happen, so that people could be prepared for it, and we did it on a Monday so that people actually had time to address it. First I made a post on the Python Discourse forum, in the packaging section, where we explained in detail why we rewrote it and what the new features are. Then we sent this out on the distutils-sig mailing list, followed by the actual release to PyPI, at which point people could actually try it. And then anyone without a version constraint or version pin automatically started getting the new virtualenv.
So the issues and feedback started to pour in. Another thing we did is that I made a Twitter post about it, which got fairly popular and got a lot of impressions, so people could learn about it through Twitter too.

In the next section I would like to describe a few technical gotchas. These are things we learned by doing the rewrite, things that caught us a bit by surprise, just to illuminate a few things you might run into when you try to do a rewrite.

One thing you should always do when you do a rewrite is consider the people stuck on old versions. Some people could not use the new version, not because they did not want to, but because their platform doesn't allow it. So always provide a way to support them. The way we did it is to keep the code we had on a legacy branch, and this legacy branch keeps getting critical fixes. The legacy documentation also remains published under a tag, and the new changelog points at its end to the legacy changelog. This was suggested by someone, so that someone looking at the new changelog can see how it transitions to the old one. The same goes for the life cycle of the project: keep releasing compatibility patches even after the release. You can see that two weeks later we still cut a release which fixed a few compatibility issues with the new virtualenv.

Another lesson was that the CPython interpreter is very diverse. What I mean by that: when you say "a Python installation" and you look at your operating system's path structure, you are used to a given layout, but this layout actually varies between platforms, because every platform that redistributes CPython can customize it. This, for example, is the sysconfig output on my platform: you can see that purelib is under the lib folder, but platlib is under lib64. The bottom line is that if you get an installation on Fedora, it may have a totally different file structure, just because when that distribution decided to ship Python, they decided to use a different folder layout.

This is one of the main things that made it very hard for the old code base to support virtualenv everywhere: it did not use the configuration information available from the interpreter, but instead tried to guess, with if/else branches, what was correct for the current platform. We switched to using the system-provided sysconfig information, which means we now have much better support for the various redistributions of CPython. For example, on the Fedora side, you can see that lib64 and lib have separate site-packages folders, and this is in line with what the expectation is on those platforms for a virtual environment.

Another thing is that Linux distributions also do other customizations; for example, Debian does not install venv by default. And there go our assumptions from earlier that venv comes out of the box: it does, unless you're on a Debian-derived distribution such as Ubuntu. And secondly, pip uses distutils to determine install paths, and distutils basically duplicates the same path configuration I showed you on the earlier slide; Debian patches this copy too.
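To make the layout point concrete, this stdlib snippet prints the install scheme the running interpreter reports through sysconfig, which is the kind of information the rewritten virtualenv consults instead of guessing:

```python
import sysconfig

# The install scheme answers "where do things live on this interpreter".
# Distributors customize it: e.g. Fedora reports purelib under lib/ but
# platlib under lib64/, each with its own site-packages.
for name, path in sorted(sysconfig.get_paths().items()):
    print(f"{name:12} {path}")
```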
That duplicate is the distutils sysconfig configuration map. So we had to pick which of the two to use, and we went with the distutils one, mainly because pip uses it, and because users install packages into the virtual environment using pip, that's what people would actually expect. Okay.

Another gotcha is long-term-support distributions. For example, CentOS ships with an old pip: the oldest long-term-support CentOS ships with pip 9, which does not understand the Requires-Python metadata, so it struggles to install our wheel or sdist of the project. What we had to do to work around this is not rely on pulling in the latest versions of our dependencies, and instead vendor them into our project, and provide a better error message in setup.py, so that people trying to install virtualenv on CentOS get a useful error message rather than something very cryptic which leaves them baffled.

Another gotcha: macOS. I told you about the ability of operating systems to customize the layout of a Python installation; macOS has even more Pythons, and they customize even further. You can get a Python from python.org; you can have a Python that comes as part of the operating system; you have Homebrew installations from which users may get a Python; and you also have the Xcode developer tools, for both Python 2 and Python 3, from which users may get a Python. And all the Apple-shipped Pythons are hardened, statically packed binaries, for security and safety reasons. This means that if you take a Python executable provided by the macOS system, you can't just copy it and use it; we have to mangle the binary so that it thinks it's in the correct location and is allowed to run. This kind of operation is what complicates pulling off the rewrite, because to support all these platforms you have to have custom code for every one of these distributions. Some of these are not even shipped by Apple, but the fact that some user somewhere might have one available on their machine means they're going to create an issue, so you'll have to support it at some level.

And here is the gotcha of fixing distutils. Some of you might not know this, and it might go away once distutils moves out of the standard library, but distutils actually allows you to customize your install prefixes, for example where your scripts should be placed, as a static path, and this is done through configuration files. The problem is that users may set these configuration files, or your operating system may even ship with them. So we basically had to monkeypatch the distutils package so that it ignores these configuration files, because otherwise what you ended up with was that you installed a package in a virtual environment and it went into the system environment instead, just because the distributor pulled a global configuration into the virtual environment. Okay.
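If you're curious which files distutils would honor on your machine, this small probe works on Pythons that still ship distutils (it was removed in Python 3.12):

```python
# Works only on Pythons that still ship distutils (removed in Python 3.12).
from distutils.dist import Distribution

# These are the config files distutils (and, through it, pip at the time)
# would read; an install prefix set in any of them can redirect installs
# outside the virtual environment, which is why virtualenv neutralizes them.
print(Distribution().find_config_files())
```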
And yeah, this was very hard to track down and handle entirely; we had to close and reopen the issue five times. Just be tenacious about fixing bugs: you'll get there eventually, but be prepared to be frustrated while actually doing it.

Another issue we ran into, in the same scope, is the Windows Store Python. Some of you might know that Windows now actually ships a Python interpreter as part of the Windows Store. It is very convenient for the user to install; however, it needs to satisfy the restrictions of the Windows Store, and those restrictions are basically that you can run its executables but you cannot read them. This can end up in surprising behavior and can trip up your expectations: for example, if you expect that the Python executable always exists as a readable file, you may find that in some cases it does not. In this case, you have to make the decision: do you want to support this platform, or do you just want to display a meaningful message for the user and encourage them to upgrade to a newer version that maybe has this fixed?

Another lesson learned is that the venv project is maturing, and the core developers are very quick to answer; however, a lot of the platforms, for example macOS and Windows, are not well tested. While we did the implementation we tried to keep the functionality on par with CPython's venv, and in doing so we discovered a lot of bugs in the venv project, reported them, and most of them have actually been fixed by now.

Another gotcha: on Python 2, to create a virtual environment, virtualenv uses the os.py file as a landmark to detect where the standard library of the Python installation lives, where the prefix of the installation is. Oddly enough, this os.py might actually be missing. There's a valid reason for that: for one, Python can actually function without it, and for another, if you remove all the .py files and just keep the compiled .pyc files, you get obfuscated source code and minimize your storage use. This can be useful on a Docker image, for example. And you should still be able to create virtual environments even when you don't have the .py file available and can only fall back to the compiled .pyc file: basically, a Python interpreter with just the .pyc and without the plain os.py.

Another thing we ran into is the seeders. The seeder is how we decided to install packages into the environment. In the old system, what happened is that we basically took the pip wheel, put it onto the Python path, and asked that pip to install itself. Now, this is slow, because pip is a general-purpose install tool: a lot of the work is avoidable, such as the wheel validation, and pip's startup time itself is a big chunk. These are the kinds of functionality you don't actually need in this use case, because we ship the wheel ourselves and we know it's correct; everything's going to be fine. So what we did is add a new seeding mechanism called app-data, and what app-data does is cache as many of the operations as possible, so that subsequent virtual environment creations have to do as little work as possible.
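For reference, this is roughly how you would select that mechanism explicitly from the programmatic API; `app-data` is the cached seeder (and, as far as I know, the default in the rewritten virtualenv), and the symlink flag asks for the fastest variant on platforms that support it. Flag names follow my reading of the CLI docs.

```python
from virtualenv import cli_run

# Select the cached "app-data" seeder explicitly and prefer symlinking the
# cached packages into the new environment instead of copying them.
session = cli_run(["--seeder", "app-data", "--symlink-app-data", "fast-env"])
```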
For example, we put everything that is cacheable into the platform-specific application data location, and we cache the wheel validation, the extraction of the wheel, and fixing up the installation records: basically most of the avoidable installation work goes into the app data. So when we actually have to do a subsequent virtual environment creation, we don't have to do all those steps; instead we just copy or symlink into the purelib folder of the new virtual environment and generate the console scripts. This means that instead of having to spend two or three seconds to provide the seed packages, it can be done in literally around 100 milliseconds if you use symlinks, and around half a second if you use the copy method.

A very divisive question was how the seed packages should be updated. You have two options: either you always keep them at the latest release, or you keep the version that virtualenv shipped with. It's divisive because it depends on what you prefer: convenience and ecosystem evolution, or speed and stability for your project. What we ended up doing, after a long discussion, is a middle ground: we auto-upgrade, but we only use a release once it's old enough, on the order of a few weeks. This means the projects have a grace period in which they can actually fix any bugs they might have shipped.

Okay, now one of the things we did not expect, and it's a bit of a gotcha, is that the user's application data directory may not be writable, so we need to be able to fall back in this case to something sane and still not fail.

Another lesson is to be friendly to your redistributors. We basically identified the parts that our redistributors wanted to patch and alter, and by understanding why they needed it, we made it simple and easy for them, and documented what should be patched when they're going to alter the functionality.

And here is where I'd talk about the download-only mode: besides pip-installing virtualenv, you can also just download the zipapp that we publish and invoke it by passing it to your Python. Okay, now I'm going to skip this slide.

One of the additional benefits is that, now that we can add extra functionality on top of the virtual environment creation, we could, for example, fix upstream bugs before the fix actually made it into an upstream release (in this case, handling the __PYVENV_LAUNCHER__ environment variable on macOS), or, for example, automatically generate a .gitignore file inside the virtual environment, to stop environments being committed to your version control.

Okay, and be prepared to do releases, but whenever you do a release, make sure that you're around for the next hour or two, so that if anything major breaks you can easily and quickly fix it. So, to conclude everything: what we learned is that when you try to do a rewrite, always have a well-defined delivery plan, test a lot, automate a lot in your CI, and prepare for a lot of bug fixes on a short turnaround.
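On the zipapp mode mentioned above, a minimal no-install sketch. The URL is the one the docs advertised around the time of the rewrite, so treat it as an assumption, and verify anything you download before running it:

```python
import subprocess
import sys
import urllib.request

# Zero-install usage: fetch the published zipapp once, then run it with
# whatever Python you already have. URL per the virtualenv docs at the time.
urllib.request.urlretrieve(
    "https://bootstrap.pypa.io/virtualenv.pyz", "virtualenv.pyz"
)
subprocess.run([sys.executable, "virtualenv.pyz", "zipapp-env"], check=True)
```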
We basically had 27 releases in five months, to the point where now it's basically stable: we don't get many more bug reports. But in the first two months there was quite a lot of work involved. So be ready to re-evaluate things in your project, and ask whether you could do it better today if you were doing it all over again.

And in general: be nice. Whenever someone does a rewrite, try to approach it in a very understanding fashion, so that you don't get to the point where people are frustrated while trying to do something good, and where, just because they do something you don't like, you get nasty and discourage them from contributing to the open source Python ecosystem. Okay. And just to say: virtualenv is still doing very well; we got a bit of a bump, but otherwise we're in a good state.

And that was all for my talk. I'm not sure if I still have question time left, but if not, just drop your questions into the chat.

Hey, that was a very, very intense talk. Thank you very much, Bernat. So we have three questions. I think virtualenv requires the Python version you want to use to already be installed; can you instead select an arbitrary Python version and have virtualenv download and install it?

Yes, it does require that. I can't hear you, you're muted or something. But to answer the question as I understood it: it's true, you can't point it at an arbitrary, not-yet-installed Python. But nothing stops you from, for example, writing a discovery plugin that actually downloads it if it's not available. For example, you could write a virtualenv plugin that uses pyenv to download and install a Python version if it's not available on your machine.

I can't hear you, Marc; not sure if that's on my side or your side. Okay, I think I'll just pick the questions from the Q&A on Zoom and answer those. So the next question is: was this work done on your personal time or your work time? If the latter, how did you argue for the business to allocate time for such fundamental work with so little direct business gain? It was done during both my personal and my work time. Let's say there is something like 20% time; I did part of this during that time, and it was considered as my personal development. That's how the business signed off on me working on it.

And the third question that we have is: will the new version of virtualenv be integrated into the standard library in the future? I don't think it will be included one-to-one, but I think what's going to happen is that the parts that turn out to work a lot better in virtualenv will be backported to venv.

Thank you very much.