Okay, so hello everyone. I guess we should start now. My name is Piotr Ożarowski, I'm from Poland. The main purpose of this talk is to convince you that there's a tool that can make it a lot easier to start the packaging work for Python libraries, so you don't have to do everything from scratch. It will speed things up a lot when you want to create a completely new package.

PyPI, the Python Package Index, is a repository of software written in Python: mainly libraries, but also applications. pypi2deb was created during DebConf in Heidelberg, and it was meant to be a wrapper around stdeb. stdeb is a tool that converts Python packages into Debian packages, and it does it really well. If you don't know it, even though it's not developed anymore, I still encourage you to try it, mainly the pypi-install script, which installs a Python package as a Debian package. So it does what pypi2deb does, but I wrote pypi2deb because I wanted to focus more on converting the whole repository. In the end I also provided the py2dsp script, which does many of the things that stdeb does, but I couldn't extend stdeb that much, so I decided to write something new.

pypi2debian is the script that will convert a whole repository. So py2dsp, where dsp stands for Debian source package, creates a single source package, and pypi2debian will try to convert the whole PyPI repository, or a subset of it.

The main features of this project: it uses PyPI metadata, so the homepage, author, description and classifiers, not only to get the list of projects but also while converting them into Debian packages, since that data is not available in the tarball. It supports all the main interpreters we use in Debian, so Python 2, Python 3 and PyPy. When it detects Sphinx documentation, it will create a separate binary package with the generated documentation.
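To make the "uses PyPI metadata" point concrete, here is a minimal sketch of the kind of data the tool works from. The dict imitates the shape of PyPI's JSON API response (pypi.org/pypi/&lt;project&gt;/json); the project and its values are invented, and the helper function is mine, not pypi2deb's actual code.

```python
# Sample shaped like PyPI's JSON API response; all values are invented.
sample_response = {
    "info": {
        "name": "example-lib",
        "version": "1.2.3",
        "home_page": "https://example.org/example-lib",
        "author": "Jane Doe",
        "summary": "An example library",
        "classifiers": [
            "Operating System :: POSIX :: Linux",
            "Programming Language :: Python :: 2",
            "Programming Language :: Python :: 3",
        ],
    },
}

def extract_fields(response):
    """Pull out the fields a Debian package needs but a tarball may lack."""
    info = response["info"]
    return {
        "homepage": info["home_page"],
        "author": info["author"],
        "description": info["summary"],
        "classifiers": info["classifiers"],
    }

fields = extract_fields(sample_response)
print(fields["homepage"])  # https://example.org/example-lib
```

These are exactly the fields the talk lists: homepage, author, description and classifiers.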
It integrates well with Debian tools, with dh-python, and I reuse a lot of stuff from the python-debian library. For example, if a given package is already in Debian, it will reuse its name. It guesses build dependencies, so it generates the Build-Depends field using names from Debian, and if a given library is not available in Debian, it guesses a name that complies with our policy.

It also generates an ITP email, something that Debian developers will be interested in. It doesn't send it, so you can review the email and polish it a bit before sending. It's easy to customize; I will talk a bit more about that later. And for the Debian Python Modules Team, something team members may be interested in, it integrates really well: for example, it initializes the git-dpm repository, it sets all the Vcs-Browser and Vcs-Git source fields, the maintainer and so on. It even suggests the commands to create a repository on Alioth and which branches to push: two lines that handle everything. So Debian Python Modules Team members may be interested in this tool.

Now, how to use it. The first script converts a single library, or an application, but it was written mainly with converting libraries in mind. You can just invoke the script with the library name; it will download the tarball or the zip file (if it's a zip archive it will recompress and repack it into a tarball, the format accepted in Debian), fetch all the metadata from PyPI, debianize it and create a Debian source package. If you want, it can build a binary package as well, but you have to add another option. If you already have a tarball, you can point py2dsp at it and it will handle the tarball without fetching the data from PyPI. If you already started packaging and you have the sources unpacked, even with a partial debian/ directory present, py2dsp can take over and create everything that's missing. The last example is for Debian Python Modules Team members.
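The name-guessing step described above can be sketched roughly like this. This is a simplified illustration of the Debian Python policy naming convention (lowercase, underscores to hyphens, a python3- or python- prefix, no duplicated "python"), not pypi2deb's actual implementation.

```python
def guess_debian_name(pypi_name: str, python: int = 3) -> str:
    """Guess a policy-style Debian binary package name for a PyPI project.

    Simplified sketch: lowercase the name, turn underscores into hyphens,
    strip an existing "python-" prefix, then add the interpreter prefix.
    """
    name = pypi_name.lower().replace("_", "-")
    # Avoid ending up with names like "python3-python-foo".
    if name.startswith("python-"):
        name = name[len("python-"):]
    prefix = "python3-" if python == 3 else "python-"
    return prefix + name

print(guess_debian_name("SQLAlchemy"))       # python3-sqlalchemy
print(guess_debian_name("python-dateutil"))  # python3-dateutil
print(guess_debian_name("Jinja2", python=2)) # python-jinja2
```

The real tool first checks whether the package already exists in Debian and reuses that name; only otherwise does it fall back to guessing.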
The profile option is one of the possible customizations. It sets quite a few options in order to produce a source package that is aligned with our team's policy requirements. It will also, for example, do all the git-dpm initialization and configuration.

The second script, pypi2debian, will try to convert PyPI, or any other repository compatible with PyPI, into a Debian repository. It will fetch the list of projects and convert all of them into Debian source packages. If you want to compile and build them as well, you can point pypi2debian at the tool you use to build packages, for example sbuild. If you point sbuild at an unstable chroot, it will build all the packages for unstable; if you have a stable chroot, or Ubuntu or any other Debian derivative, it will build packages for that derivative.

If you don't want to convert the whole PyPI repository, you can use filters, like the classifiers here. You can use multiple classifiers: this example will use only packages that are compatible with Linux, but you can specify more. If you want, for example, only those that work with text editors, you can just pick the right classifiers and feed pypi2debian with them. The Python 3 option here will create only Python 3 binary packages, because by default both tools create a binary package for each interpreter that the upstream author claims the given library works with. If it has a classifier that says the package works with Python 2 or PyPy, it will create binary packages for all these interpreters, but if you want to limit it to Python 3 only, you can do that with the option shown in the example.

There are multiple ways to customize it. There are profiles, like in the example for the Debian Python teams. It's really easy to create new ones: I created one for DPMT, one for the Python Applications Packaging Team, and one for the OpenStack team, but that one is probably outdated already because I created it about two years ago and the policy has probably changed.
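The classifier filtering and interpreter selection just described can be sketched as follows. The function names and the project data are my own illustration of the idea, assuming the standard Trove classifier strings; they are not pypi2debian's internals.

```python
# Hedged sketch: filter projects by classifier, then decide which
# interpreters to build binary packages for.
LINUX = "Operating System :: POSIX :: Linux"

projects = {
    "foo": [LINUX, "Programming Language :: Python :: 2",
            "Programming Language :: Python :: 3"],
    "bar": ["Operating System :: Microsoft :: Windows",
            "Programming Language :: Python :: 3"],
    "baz": [LINUX, "Programming Language :: Python :: Implementation :: PyPy"],
}

def matches(classifiers, required):
    """Keep only projects carrying every requested classifier."""
    return all(r in classifiers for r in required)

def interpreters(classifiers):
    """Interpreters to build for, based on what upstream claims."""
    wanted = set()
    if "Programming Language :: Python :: 2" in classifiers:
        wanted.add("python2")
    if "Programming Language :: Python :: 3" in classifiers:
        wanted.add("python3")
    if "Programming Language :: Python :: Implementation :: PyPy" in classifiers:
        wanted.add("pypy")
    return wanted

linux_only = [name for name, cl in projects.items() if matches(cl, [LINUX])]
print(sorted(linux_only))                     # ['baz', 'foo']
print(sorted(interpreters(projects["foo"])))  # ['python2', 'python3']
```

Restricting to Python 3, as in the talk's example, would just mean intersecting the returned set with {"python3"}.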
Another mechanism is overrides. By default, both tools will search the current directory, in a subdirectory named the same as the source package, and you can use that or set pypi2deb's overrides-path environment variable to point at the place where you keep all your overrides. Overrides are meant for overriding settings or options for one package only. If you want to change something for all of them, you can customize the templates: the templates for the debian/control file, the debian/rules file or the ITP email. If you want a different one, you can just take the templates I created, modify them and put them in a different directory. Jinja is used to render them, and Jinja is quite a powerful template engine, so you can use statements and so on to modify and customize it.

Another way to customize are hooks. There are currently two hooks, pre and post. Pre is invoked before the debian/ directory is created, so the sources are already unpacked, and the second hook, post, is invoked once the debianization is already there. Both are just simple scripts: if there is an executable hooks.pre or hooks.post file in the overrides or templates directory, it will be invoked. I use it, for example, to set up the git-dpm repository.
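An overrides directory along these lines might look like the sketch below. The hooks.pre / hooks.post filenames and the name-of-source-package subdirectory follow what the talk describes; check the pypi2deb documentation before relying on the exact names, and "python-foo" is a made-up package.

```shell
# Sketch of an overrides directory for a hypothetical source package
# "python-foo", assuming the layout described in the talk.
mkdir -p python-foo

# Pre hook: runs after the sources are unpacked, before debian/ exists.
cat > python-foo/hooks.pre <<'EOF'
#!/bin/sh
# $1 = package name, $2 = version (as described in the talk)
echo "pre hook for $1 $2"
EOF

# Post hook: runs once the debianization is in place.
cat > python-foo/hooks.post <<'EOF'
#!/bin/sh
echo "post hook for $1 $2"
EOF

# Hooks are only invoked if they are executable.
chmod +x python-foo/hooks.pre python-foo/hooks.post
./python-foo/hooks.pre python-foo 1.0
```

Static files (a git-buildpackage config, a fixed debian/copyright) and modified templates would sit in the same per-package directory.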
If you want to use a completely different source repository, like a private company repository compatible with PyPI that you use to keep your company's libraries, you can point pypi2deb at it using these two environment variables, PYPI_JSON_URL and PYPI_XMLRPC_URL, and pypi2deb will use it instead of the official PyPI.

In the overrides, profiles or templates directories you can put static files, like a git-buildpackage configuration, or a debian/copyright file if it's static, if you don't need it to change and it's the same for all libraries. Or you can use .tpl files, templates in Jinja. If you know, for example, that the tool doesn't detect the upstream author correctly, or you want to hard-code the upstream author to some string, you can use the static context JSON file and put the values for these fields there: author, build dependencies if not all are detected, the description, the homepage and so on; there are quite a lot of them. So you can do that in the context JSON file. Or, if you are the upstream author and you want to help py2dsp and pypi2debian, you can use the py2dsp section of setup.cfg, and it will read all the options from there. Or you can hard-code them in the .tpl files.

The pre and post hooks are simple shell scripts which, when invoked, get the package name and version as arguments, and if you need more, the context JSON files are created in the build directory as well, so you can read them from your scripts and use all the detected values from there. And if you don't like Jinja and want to use your own template engine, or for whatever reason, you can just use the detection mechanism and invoke everything else from a script.

That's the first part. We probably don't have time, but I will try to at least show you a few examples. In the meantime, if you have any questions, please ask. Is that big enough?
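A hook that wants more than the package name and version could read the context JSON file from the build directory roughly like this. The filename "ctx.json" and the field names are assumptions for illustration; the sketch writes a sample file itself so it is self-contained.

```python
# Sketch: how a post hook could consume the context JSON file that
# the tool writes into the build directory. Filename and fields are
# assumed, not taken from pypi2deb's documentation.
import json
import tempfile
from pathlib import Path

build_dir = Path(tempfile.mkdtemp())

# Pretend the tool already dumped its detected values here.
ctx = {"author": "Jane Doe",
       "homepage": "https://example.org",
       "binary_packages": ["python3-foo"]}
(build_dir / "ctx.json").write_text(json.dumps(ctx))

def read_context(build_dir: Path) -> dict:
    """What a hook would do: load every detected value in one go."""
    return json.loads((build_dir / "ctx.json").read_text())

loaded = read_context(build_dir)
print(loaded["author"])  # Jane Doe
```

The same file is what you would edit by hand to hard-code a value the detection got wrong.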
So, as you can see, there is an ITP email from the template. If the upstream author adds changelogs to the package description, py2dsp is clever enough to recognize that and remove them, and there are a few more hacks I added to polish it a bit. It probably needs more work, but anyway the ITP email looks complete: there is a description, there are binary package names and all the fields an ITP email needs. These are examples of the detected values, so it's the right package. Interesting. I will just show you the control file: as you can see, it detected the Python 2 and Python 3 interpreters and there is a separate binary package for each. The upstream author probably didn't mention PyPy, which is why it didn't create one. But let's try to build it, and in the meantime, are there any questions?

The binary package has the same name as the source package: you can apt install pypi2deb and try it yourself. Thanks.