Hello, everybody, and thanks for coming. Today I wanted to talk about automating the tedious and error-prone parts of releasing new PyPI packages. Some background first: I've been contributing to open source for 21 years now, and over the years I've accumulated a number of projects that I'm either the author or maintainer of. During that time I've learned the hard way what can go wrong in the release process. If I can just show you. I'm sure there are people who maintain far more projects, but even a subset of this, if they are very active, can take its toll. The problem with releasing new stuff is that when you do packaging, a lot of things can go wrong. For example, you might miss some files in the manifest, or you might forget to tag your release in Git, and these things are really hard to fix afterwards. But even then, you would really like to automate the process in order to speed it up, because if the maintainer feels that the release process is troublesome, then the maintainer is also far less likely to make frequent releases. So when I was struggling with these problems, I set out to figure out what I could do to improve the state of things. I came up with a process that involves a few select tools, and the last piece of the puzzle actually became available last year; I'll talk about it later. If you have ever released anything, you may remember that at the minimum you need to run python setup.py sdist bdist_wheel and then twine upload. But you also should tag your releases, and you should amend your changelogs, and whatnot. Some of that can be automated, and I will show you what I have done to do that. The process goes as follows. First, you make a tag and push it to GitHub or wherever. GitHub notifies Travis that there is a new tag, and Travis triggers the build process.
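The manual steps just mentioned, plus the ones that are easy to forget, can be sketched as a command sequence like this (a sketch; the tag name is hypothetical):

```shell
# Build the source distribution and the wheel, then upload both
python setup.py sdist bdist_wheel
twine upload dist/*

# The steps people tend to forget:
git tag 1.2.3
git push --follow-tags
```

Every one of these is a chance to make a mistake by hand, which is what the rest of the talk automates away.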
And assuming everything goes well, Travis then automatically uploads the packages to PyPI. So how do we make this happen? The first piece of the puzzle is setuptools_scm. Are you retelling my release process? What? It looks like you are retelling my process, because I do exactly this. Right, I just checked. Well, it's good to hear; it kind of validates it. So what setuptools_scm does is actually two things. First of all, it eliminates the need for MANIFEST.in. If you don't happen to know, MANIFEST.in is a file required by setuptools which tells it which files to package into the source distribution. With setuptools_scm you don't really need that anymore, because it just takes a look at what you have in the repository and packages the same files. That eliminates one vector of errors. The second thing is automatic versioning based on git tags. This way, you only have to add a git tag every time you make a new release, and the version number lives in only one location, which is the git repository. If for any reason you want to have the dunder __version__ variable, there is a way to do that; setuptools_scm has instructions in its documentation. A bonus feature is that if you make additional commits, it automatically increments the version, which is configurable. I don't like the defaults; I configured it to add a post suffix, like post1, post2, post3, and so forth. There are a lot of configuration options that you can use to tweak it to your liking. Now, the second piece of the puzzle is a Travis CI feature called Build Stages. It actually launched as far back as last year, but nine days ago it went from beta to an official feature. The point here is that you have separate stages, where all the jobs in a single stage run concurrently, and only when all of them have finished does it launch the next stage.
So what you can do with this is make the deploy stage the last one, to ensure that all of the actual test jobs succeed first. Travis has had a PyPI deploy feature for a long time, where after the build it will package the project and upload it to PyPI. The trouble was that if you did that without any extra configuration, and you had multiple Python versions as jobs, it would try to upload the packages in all the jobs. And even if you restricted that, it could happen that the build failed for some Python version, but not the one where you had the deploy configured, so you would end up with a broken release. Build Stages fixes that. Also, I'd like to mention that sometimes you need experimental builds. I think pip has a build job on Python nightly. You can configure such a job to be allowed to fail, so that it doesn't prevent you from releasing, but it's still nice to know whether things pass on nightly or not. Now, I would like to do a practical exercise here, assuming the internet works and nothing goes wrong. Let's build a trivial project and upload it to Test PyPI. Test PyPI, if you don't know, is a parallel site to PyPI; it's sort of a sandbox site, because you don't want to upload this kind of rubbish to the real PyPI. So in the first phase, we will create a new project on GitHub. I checked beforehand, and this name shouldn't be taken on Test PyPI. Yeah, this is something that is required; I was actually supposed to talk about it. Hang on. I'm sorry, technical difficulties. So what we are going to do is, after we have created the project on GitHub, we will add the code and tests. Then we will add the metadata and packaging configuration, that is, setup.py and setup.cfg. Then we will add the configuration for tox and pytest, and finally the Travis configuration. The requirements for this are that, of course, you need a GitHub account, obviously.
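The shape of such a staged configuration might look roughly like this (a sketch; the Python versions, job layout, and keys are assumptions, not the exact file from the demo):

```yaml
# .travis.yml -- sketch: jobs within a stage run in parallel, and the
# next stage starts only when the whole previous stage has succeeded
language: python
script: tox
jobs:
  include:
    - stage: test
      python: "3.6"
    - stage: test
      python: "3.7"
    - stage: test
      python: "nightly"   # experimental build, informative only
    - stage: deploy
      python: "3.6"
      deploy:
        provider: pypi
matrix:
  allow_failures:
    - python: "nightly"   # must not block a release
```

Because the deploy stage comes last, a red nightly job is visible but a broken release is impossible unless a required test job also fails.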
And this is something that has changed recently: Travis is now migrating from travis-ci.org to travis-ci.com. It causes some difficulties, and the process has changed a little bit, but it shouldn't be a problem. In addition to that, you will need the Travis command line tool installed; it's a Ruby thing. And you need to run travis login just once. I've never had to do it twice. But on to the demo. Let's first create and check out the project. Oops, git clone; that was the intention. Now let me just open it in the editor. I think there's a high-DPI mode, but we'll manage. Now, the next phase is the packaging configuration. For now, at least, setup.py is always required. That may change at some point, depending on who you ask, but for now it's a requirement. The thing is, my personal favorite way to configure the metadata is to use setup.cfg, which is a setuptools feature that was fairly recently introduced, and that is what we are going to do now. But before that, we'll create some code. We just configure the Python interpreter for this. Let's add something like this, and a trivial test for it. Now we have a package and a test for it. One side note before we continue: there's also a file called pyproject.toml that you may or may not need. In this case it's not really necessary. It may only be necessary if you don't have the required minimum version of setuptools, but other than that it's not needed, and besides, it only works from pip 10 onwards. So let's go write those setup.py and setup.cfg now. We need to use setuptools_scm here. What was the keyword? Let me check another project; this is sometimes hard to remember. You also need to make sure that it's in the setup requirements, like this. That's pretty much it. Most setup.py files are not this short, but that's because we are adding the metadata to setup.cfg. OK, in addition to the metadata, we need some setuptools options.
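A minimal setup.py along the lines being described might look like this (a sketch assuming setuptools_scm; all the project metadata is assumed to live in setup.cfg, so nothing else is needed here):

```python
# setup.py -- minimal on purpose, because the metadata lives in setup.cfg
from setuptools import setup

setup(
    use_scm_version=True,                # version comes from the git tag
    setup_requires=["setuptools_scm"],   # needed at build time
)
```

use_scm_version=True is the keyword the speaker had to look up; it is what hands version numbering over to setuptools_scm.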
If you're wondering about this syntax, it's a special directive that tells setuptools to find any packages in the root directory. Also, since we will be having tests, we need something to run them, for which we will be using pytest and tox, as mentioned before. Some projects add some kind of dev-requirements.txt or whatever; my way is to add all the test requirements as extras. We'll just use pytest for now. OK, I think that's the bare minimum. We still don't have a README, which is fine for this demonstration, but if you publish anything on PyPI, make sure you have one. Now we need the tox.ini. Let's say we want to test on Python 3.6 and 3.7, and we want to be running pytest. The last part, posargs, is just a placeholder where the positional arguments are put, if you pass any to tox. And in order to make tox install the test extras, you will want this. OK, let's see how horribly wrong it goes. Right, this happens because I don't have a Git repository here yet. setuptools_scm needs a Git repository with at least one commit in order to determine a version, so let's add one. We want to add almost all the files here. OK, let's see what happens now. I will get to that later, but I'm really new to Pipenv, actually, and my feeling is that the point of Pipenv is to pin all the dependencies, which you don't want in an open source library. Someone could point out that I'm wrong, but we'll talk about it later; for now, we don't need it. We have a failing test. Yes, this was intentional, believe it or not, because the best way to validate your test is to have it fail at first. Really. So we will add this, and now we have a passing test, so it's all good. Now we can just amend the first commit. OK, now we have the bare minimum, I think, to put it on Test PyPI. Before we do that, we need the Travis configuration.
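The tox.ini being described might be sketched like this (assuming the test dependencies were declared as a "test" extra in setup.cfg; the extra's name is an assumption):

```ini
# tox.ini -- sketch; envlist matches the Python versions we test on
[tox]
envlist = py36, py37

[testenv]
# install the project's "test" extras declared in setup.cfg,
# so pytest is available inside the tox environment
extras = test
# {posargs} is replaced by anything passed after "--" on the tox command line
commands = pytest {posargs}
```

Running "tox -- -k some_test", for example, would forward "-k some_test" to pytest via the {posargs} placeholder.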
So for that to happen, it's easiest to just copy your Travis configuration from another project; that's what I do. There's a command called travis setup pypi. It adds some bare minimum configuration to your Travis configuration, but in my opinion it condenses everything and makes it pretty unreadable. So what I'm going to do here is run that very command, but only take the encrypted password from it. That's the only thing you really need to change, in addition to the repository name. So what I'm doing here is copying the Travis configuration from the wheel project. We don't need the coverage parts at this point, or notifications. OK, so there are two things that need to be changed here: the repository name and the PyPI password. For the PyPI password, I'm going to run the travis setup command. You should add --api-endpoint pointing to travis-ci.com, because they have a bug in the client which they haven't fixed yet; I actually provided them with the workaround. It should basically be --api-endpoint pointing to https://api.travis-ci.com/. You don't use --pro, because it breaks things as well. OK, do I put it before or after setup? OK. It still authorized against the wrong endpoint, so let me log in again first. You know, I rehearsed this before and it worked just fine; I created a new repository on the .com side today and it worked just fine. The trouble is how it's installed on GitHub. There is this new API called GitHub Apps. It has a different flow, and GitHub basically notifies Travis whenever a user grants access, so you don't have to synchronize it yourself. Yeah, I'm sorry for the trouble; this did work earlier today. It's the famous demo effect. This is the open source version, and travis-ci.com used to be only for commercial use, so there is a flag, --pro, in the client for authorizing against that one. But the client then assumes that it's the paid service and does something wrong.
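The endpoint workaround being described boils down to something like this (a sketch; the flags match the Travis CLI of the time, and the password placeholder is obviously not a real credential):

```shell
# Log in against the travis-ci.com API explicitly
travis login --api-endpoint https://api.travis-ci.com/

# Generate the encrypted PyPI password against the same endpoint,
# so it is encrypted with the matching public key
travis encrypt --api-endpoint https://api.travis-ci.com/ YOUR-PYPI-PASSWORD
```

If the two commands hit different endpoints, the password gets encrypted with the wrong site's key and the deploy fails silently, which is exactly the trap the speaker ran into.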
But there is an argument where you can set this endpoint explicitly in the open source mode, and that's basically how you work around it. Otherwise it will generate the encrypted password with the public key stored on the other site, which will not work on this one; it will fail silently. OK, it worked now. It's asymmetric encryption, so that's the only thing we're looking at. Yeah. But things are in a flux at Travis, so I'm not really surprised by anything. Nope, I just logged in again. OK, now we have a Travis configuration here. Let's hope we have time to see the process through to the end. OK, now let's see what happens if we push. We need to tag it first. Oh, I did. You did git add. Yeah, and then I committed via the GUI. There's one more thing that we need to change, because we don't want to pollute the actual PyPI. Yes. OK. Now we will add the git tag and push. Now let's go and see what happens, assuming it works. Wait a minute. This is .org, and you need to go to a different website. Are you sure? Maybe try .com in the URL. OK, great. Right, there were a few Python versions that we didn't really need. Oh, right, what happened here is, yes, we needed 3.6 and 3.7, nothing else. Yeah, we are running out of time, so we don't have time to wait for the builds to complete, so let me just explain how this works. We have two stages, test and deploy to PyPI. The last stage is only activated if there's a tag incoming, and this is a regular expression that should match the tag, in this case a major.minor.patch version. I'm sorry, Alex. As I said, I have to keep control of the time, and your time is up. Even though some problems happened, let's give a warm welcome to Alex again. Thank you.
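The tag condition just explained is an ordinary regular expression; a hedged sketch of a major.minor.patch pattern (the exact pattern used in the demo may differ):

```python
import re

# Deploy only for tags that look like a final release: major.minor.patch
TAG_RE = re.compile(r"^\d+\.\d+\.\d+$")

print(bool(TAG_RE.match("1.2.3")))        # a release tag: deploy
print(bool(TAG_RE.match("1.2.3.post1")))  # a post-release: no deploy
```

In the Travis configuration, a pattern like this would sit in the deploy stage's tag condition, so that only properly versioned tags trigger an upload.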