It's almost ready. It's Ben Nuttall. He will talk about tools for maintaining an open-source Python project: a walkthrough of some great tools he uses for developing, testing, maintaining and managing projects. I see he's online, so please share your screen and let's go. Okay, let me get this full screen. Can you see my slides okay? Perfect. Great. Well, thanks for that. So, yeah, I'm going to be speaking about tools for maintaining an open-source Python project. I'm a software engineer at the BBC, in an innovation team called BBC News Labs. I joined in January from the Raspberry Pi Foundation, where I'd been for the last six years, so if you've seen me speak before, I was probably advocating Raspberry Pi. I created the GPIO Zero library and the piwheels project, and I'm also one of the contributors to the pyjokes project. I write for opensource.com, you can find me on Twitter, and that's my GitHub. This is what I used to look like in real life, and now I look more like this, as you can see, lockdown lack of haircut, and this is what I look like online. This is my avatar. So, just to give you a brief idea of what this talk covers: I'm going to be talking about how to organize a Python module and how to structure the files; how people distribute software, the different methods of doing that and why we do it; using Git and GitHub to version control, publish, share and collaborate on your software with other people. I'm going to touch on virtual environments; testing your software and automating your testing; documenting your code and your project; and a little bit on licensing your software. What this talk is not: it's not a thorough follow-along tutorial, because I mention about 50 different tools, mostly in passing, so you won't be able to follow along and do the examples as I go.
I'm just going to go over them quickly and give you the big picture; if you want to learn more about each of them, I'll tell you where you can find out more. I'm also not going to be telling you which tools you should be using. It's not my job to tell you that; I'm just sharing the ones I use, and if I've not mentioned another tool, it's either because I've not come across it or because the examples I want to share fit the tool I've chosen. Everyone has the right to choose whatever tools they want to use. I'm also not telling you that in order to be considered a proper Python programmer you need to know all of these tools inside out. I hope that comes through. Just to give you some background on where a lot of the content of this talk has come from: the GPIO Zero library I mentioned, I developed this when I was working at Raspberry Pi. It's a Python library providing a simple API for physical computing with Raspberry Pi. It eases the learning curve for young people, beginners and educators, with a nice Python API. And if you're an experienced programmer, it's not just an easier way to do things for kids; it's also quite a nice way, once you know good Python, to write nice Python code for your GPIO and physical computing with Raspberry Pi. You'll find the docs and the GitHub project there. I just wanted to share that Guido started playing around with a Pi last year and tweeted about how he loved the library. So a bit of a humble brag there; I'm quite pleased with that. piwheels is my other project. This is the tooling for automatically building wheels of everything on PyPI. Wheels are binary distributions of compiled Python libraries and modules. piwheels.org is the repository that's built by the tooling, so it's a whole repository like PyPI.org.
It's a pip-compatible repository that hosts ARM wheels built by the piwheels tooling. We natively compile ARM wheels on Raspberry Pi 3 hardware, targeting Raspberry Pi users. The repository is hosted on a single Raspberry Pi in a cloud platform, and that single Pi serves over a million downloads of wheels every month. That's at piwheels.org, and the source is on GitHub. On both of these projects I work with a friend of mine called Dave Jones. He's a professional programmer and amateur dentist; this is a recent picture of him before he performs some dentistry on his partner over the lockdown period, and I'm using this photo with permission. Dave is responsible for implementing my crazy ideas. With things like GPIO Zero and piwheels, what generally happens is I come up with something and say, well, it would be good if we could do this, and he ends up implementing it. The way we tend to work is I write the first 90% and then he writes the next 90%. So Dave co-authored GPIO Zero and piwheels with me, and he's also got a bunch of other really cool projects that he's built himself. Dave introduced me to a lot of the tools in this talk, so I wanted to give him a hat tip for that. So when we start writing a Python module, it usually looks something like this: you just have a Python file named after your project, and you write your code in there. Generally speaking, that's how projects start. Now, you might want to throw that up onto GitHub, so you create a repository and push your code up to a personal GitHub repository, under your own name. This is my username on GitHub, and this is a project that belongs to my own user account. So you push it up. The way GitHub works, at a very basic level, is that this is a folder containing a single file.
And if I create a GitHub repository of that and push it to GitHub, it will essentially put the folder structure of my project online in a really basic way, and obviously it does much more. You can also create a GitHub organization for your project, especially if your project comprises multiple components, multiple different repositories; you can have different repositories under a particular organization name. That might be a company, an open-source project, or a wider group, something like that. You can actually move things from a personal account to an organization, and I did that with both of these projects: GPIO Zero and piwheels have their own organizations, and the multiple repositories belong to each organization. GitHub provides a way to add collaborators to your project. You can invite individual GitHub users, or you can create a team and say these people have access to these repositories, or these people have read access if it's a private repository, that kind of thing. And with an organization, you invite them to the organization and they have whatever access you've given them. GitHub has branches, so when you push up multiple branches of code, you could be working on one feature that's not quite ready to be merged yet, and you can share it online on GitHub and other people can see it. Other people could also be working on other parts of the project on different branches, and they can be managed that way; GitHub gives you a visual representation of what's going on. And with GitHub releases, if you tag a Git commit with a release name, you can share releases like this and see the different points at which versions were released.
GitHub provides issues, which are a really good way both of accepting bug reports from your users and, as the maintainer, of driving the development of your project and your roadmap. The way I tend to do things is, if I want to see a feature in my library, I create an issue saying it would be nice if we had this, describe it in whatever detail, and either I get around to doing it or somebody else might pick it up in the future and do it themselves, commit the code and close the issue. You can tag issues with labels and organize them in different ways. Pull requests are a way for, once your code is on GitHub and accessible to others, other people to take a clone of your project, commit some code, push it and request that you merge their changes in. A lot of people are able to contribute to these libraries because they're out there on GitHub, and you have the ability to modify, reject or merge changes as appropriate. GitHub also provides project boards, which are a way of organizing your existing issues and pull requests however you want, but you can also create little notes, which are not issues, just little bits of text describing features you want to add or things you need to address, and be able to visualize the state of play, especially if you're collaborating online and not office-based. Having visual representations, the kind of thing you might have Post-it notes for in an office, can be really useful for managing the project. So, distributing software. How do we do this? It's quite common for software to be packaged in such a way that it can be installed by many users; for instance on Linux, you might expect to be able to install some software with the apt tool.
So apt install such-and-such, or on Fedora, RPM and yum, and so on for other systems. And then you've got things like pip, which is a language-specific package manager: pip is Python's package manager, npm is for Node.js, and gem for Ruby. Then there are the portable Linux formats, Snap, Flatpak and AppImage, different methods of distribution with their own pros and cons, and they're quite popular at the moment. And on Mac you've got Homebrew, so you can brew install something. Then there are the less sophisticated ways of getting software: downloading directly from GitHub, and I've talked about GitHub a lot, but I should mention GitLab and other alternatives that provide a lot of the same functionality; downloading from SourceForge in the olden days; or providing something for download on your personal website, and things like curl as well. So, different methods of distributing software. Why do we distribute software? First of all, for ease of access. If you make some software and you want other people to be able to use it, they need to be able to download and install it, and if you can do that in a uniform way that matches their expectations, it'll be much easier for them. If I'm using Debian or Ubuntu, I sort of expect that if software is available, it's available in apt for me, so I can apt install it, and I kind of expect that it's just there. Or if it's a Python library, I might expect to be able to pip install it and not have to go and find the website where it's hosted or some obscure method of downloading it. And especially with apt and things like that, you have a certain amount of trust and confidence that what you're getting is quality and that it's the real deal.
It's from the author themselves and you're getting it from an official source. And there's the stability: you know this is coming from the right source, and particularly in something like Debian, you know this is a stable version that's supported in Debian. If it's on pip, you can actually go and look and see all the version numbers and release dates and what version you're on, so you know where you stand, which is really important. Licensing is important to talk about at this point. It's really important to choose a license for your project. It's really easy to just discard this and say, well, it's open source; if it's on GitHub, people can do what they want with it. But what people don't consider is: if this happened, would it annoy me? Would I be annoyed by somebody's use of my project if they started selling it? If they started using it in a particular way, if they renamed your project, if Google took your project and released it under their own branding, would you be happy about that? Would you rather choose a license that protects you in whatever way you want to be protected? There's lots to think about, and it's not a simple issue. I'm not going to recommend any particular license, but if you go to choosealicense.com, that's a great resource for describing what your project needs and what your needs are, and it will help you choose a license that's appropriate. It's also important to include the license with the source code: include it in your GitHub project, include it in your files, and when you make a distribution that you share, if you're publishing it to PyPI and it's pip installable, the license should follow the code wherever it goes.
So when somebody installs it, the license should be there with the code. It shouldn't be left to, well, this came from PyPI and the license is on the GitHub page, that kind of thing. It's important to keep it with the project. So if you want to start creating a Python module, regardless of which method of distribution you're going to use, essentially you want to start with this. You've got your project.py that we had before, where the implementation of your project lives, and you stick that in a folder, a directory, named after your project. And you need to create an __init__.py, with double underscores on each side; the name for this is dunder init, where dunder means double underscore. So you stick an __init__.py in your project folder, have your project code in another file, and you have a README file. I'm going to talk about different formats for READMEs and documentation later, but this one is a reStructuredText file. And you have a setup.py. The setup.py might look something like this; this is a reasonably minimal setup.py. It describes how your project is built and which modules it provides, and things like that. It uses the setuptools module, so essentially it runs this setup function provided by setuptools, and you provide it with all the different information about your project. You give it a project name; a version number, well, not strictly a number, it's a string, and you can look up how people tend to version their software; an author name; a short description, just a one-line string; a license; you can provide keywords; and a URL to where people can find the project, so if you've got a homepage, or if it's just on GitHub, you can put that there. Now, for packages, I'm using the find_packages function here, provided by setuptools.
All that essentially does in this case, because it's a really simple example, is return a list containing the string "project", which is the name of the folder; that's the bit that becomes importable and gets distributed on the system when somebody installs it. But find_packages will go away and find any modules provided by your package. And then long_description is what will be shown, as you'll see later, on a PyPI page: the full description of what the project is, which is usually your README in the GitHub project, and it's good to be able to replicate that both on GitHub and on PyPI. I'm just using a read function there to open it from the file. So, if you want to publish your Python module on PyPI, the Python Package Index: first of all, you register an account on pypi.org. You create a .pypirc file with the credentials you created, your username and password, and you want to install a tool called twine. And if you look up the Python packaging documentation, it didn't used to be very good, but it's got a lot better recently, and it's really good documentation on how to go through this full process. But that's the gist of it. And once you've published your module, it becomes available as a PyPI project page, something like this. This is the one-line short description; this is the full description, which I haven't really made much use of here; and the different version releases and the files that are available and a link to your homepage, all that kind of thing becomes available on PyPI. So, __init__.py: there are different choices you can make about how you structure this. It's possible to just write the full implementation of your project inside __init__.py, but people don't tend to do that.
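To make the setup.py discussion concrete, here is a minimal sketch along the lines just described. Every value in it, the name, version, author, license and URL, is a placeholder rather than a real project:

```python
# setup.py -- a minimal sketch; every value here is a placeholder
from setuptools import setup, find_packages


def read(fname):
    # reuse the README as the long description shown on PyPI
    with open(fname) as f:
        return f.read()


setup(
    name="project",
    version="0.1.0",
    author="Your Name",
    description="A one-line description of the project",
    license="MIT",
    keywords="example tools",
    url="https://github.com/yourname/project",
    packages=find_packages(),  # finds the "project" package directory
    long_description=read("README.rst"),
)
```

With a file like that in the project root, pip install . installs the package.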
The two schools of thought that I use are these. This is one of them: with GPIO Zero, we want to make it really easy; it's just a library, so people just import things from it. We want them to be able to import things easily and not worry about a nested structure of where different things happen to be implemented in different files and folders. We want to provide from gpiozero import LED, or Button, or Servo, or Motor, that kind of thing; we just want them to be able to import all the bits they need from the top-level namespace. So in __init__.py we use relative imports to bring in everything we need; wherever things are scattered around in different files in different locations, we import them and provide them in __init__.py, which means people can import them easily. And then the setup.py contains things like the version number, which goes straight into setup and isn't imported from here. With a library like this, where you've got code in your __init__.py, it's tempting to try to put your version number and things like that in __init__.py so that people can import it and see the version, but it can cause conflicts if you structure it like that, because when you run setup.py, it tries to import your code, which might have dependencies, and that can cause problems if, for instance, your dependencies aren't available at the time somebody's trying to build the project. The alternative way of doing it, if the import structure isn't the most important thing, if your package is a module that people install to get command-line tools, for instance, rather than a library of things they import, is this, which is a good way of doing it.
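The first school of thought, re-exporting in __init__.py, can be sketched as a runnable example. The project package and add function here are invented for illustration, and the example writes the package to a temporary directory just so the import actually works; in a real project these files would simply live in your repository:

```python
import os
import sys
import tempfile

# Build a tiny package on disk so the import below actually works.
tmp = tempfile.mkdtemp()
pkg = os.path.join(tmp, "project")
os.makedirs(pkg)

# project/project.py -- the implementation lives here
with open(os.path.join(pkg, "project.py"), "w") as f:
    f.write("def add(a, b):\n    return a + b\n")

# project/__init__.py -- a relative import re-exports the public API,
# so users write `from project import add` without caring which
# internal file it is implemented in
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("from .project import add\n")

sys.path.insert(0, tmp)
from project import add

print(add(2, 2))  # 4
```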
So: actually putting your version number and all your setup.py metadata inside __init__.py, and then importing it from your module and passing it into setup. Another thing is entry points. Entry points are a way of providing access to parts of your program that you want to make available as what are called console scripts. If you want to make a command-line tool, where the command project, for instance, launches some part of your project, you do it like this: you provide entry_points in the setup function. You define entry_points as a dictionary; console_scripts is one of the types of entry points, and there are others. And that's got a list of each command you want to provide. It's a bit odd that the syntax is like this, that it's all just in one string, and that the dot and the colon here are kind of syntax within a string, but that's just how it is. This essentially makes the word project available as a command, and it finds the main function in the cli file in your package called project. And once it's installed, you'd be able to do something like this. So, virtual environments are a really good way of creating an isolated environment that you pip install your requirements into. You can actually build your project inside the virtual environment in such a way that the changes you make in your library are installed in real time, so as you make changes, it's as if you've got the latest version of your project installed in the environment. And you know it's separated from your wider environment: it doesn't have your system Python, it doesn't have the system packages installed, it's just isolated from everything else. I recommend a tool called virtualenvwrapper, which provides the command mkvirtualenv, and with this command you create a virtual environment called project.
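The entry points structure described a moment ago might look like this inside the setup call; project, cli and main are placeholder names for the command, the module and the function:

```python
setup(
    # ... name, version and the rest as before ...
    entry_points={
        "console_scripts": [
            # the format is "command = package.module:function"
            "project = project.cli:main",
        ],
    },
)
```

After installing, typing project at the shell runs the main function from project/cli.py.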
And as soon as you've run it, you'll see the word project in brackets in front of your shell prompt. If you're on Linux, you might be used to typing python3, because that's the system Python; but when you create the virtual environment you can tell it to use Python 3, and then python and pip point at your environment's Python and pip. You can see that if I run which python, it points inside the environment's directory. Use the deactivate command to leave the virtual environment, and when you want to switch to another project, use workon. The first time you don't need to, because mkvirtualenv activates it for you, but if you want to revisit one, just use workon and the project name. Makefiles. This is a thing I imagine a lot of people are a little bit skeptical of, almost afraid of; they seem like quite a complicated, archaic tool. But if you strip them down to their basics, they can be really useful and actually quite simple. For instance, for everything I've shown you so far, you need to be able to provide a way for people to install from the source code, so pip install ., and you just provide the command make install, which wraps whatever your install instructions are, and make develop, which in this case installs it in an editable way so that people can develop on the project in their virtual environment. You start small with something like this, and later, once you've got things like test suites and documentation builders and deployments, you can define inside here how each of them should work and provide them in a really uniform way: make install, make develop, make test, make deploy, whatever it is you've got. There are more complex things you can do and lots more you can learn.
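A starter Makefile along those lines might be as small as this. It's a sketch: the test target assumes you're using pytest with a tests directory, and pip install -e . is the editable install that make develop wraps:

```make
install:
	pip install .

develop:
	pip install -e .

test:
	pytest tests/
```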
But I think they're a really good way to get started, and like everything I'm going to be talking about, the best way to learn more is to take a look at other people's projects and see what they do. So, testing next. The whole point of testing is that you write tests to validate what your code is supposed to do, and you keep your old tests around to make sure nothing breaks in future. If other people are working on the project, they don't need to know about those tests; they just need to run the test suite, and if they introduce a bug in some code that you wrote a year ago, five years ago, the test suite will tell them about it. There's an approach called test-driven development, TDD. For maximum effect, if you're taking that approach, you write the tests before you write the code. You kind of write by wishful thinking: you say, I think the library should do this, and you write what the user would write, and you assert that this would happen when they run this function. Then you see it fail, and then you go and write the code that actually makes it pass, and you drive yourself forward in that way, which is an interesting and useful approach. You can write tests that run really quickly, and it's important that they do run fast, so you're not held up waiting for your tests. And it can be automated: once you push, the tests can run automatically on something like Travis, which I'll talk about, so that if, for instance, somebody else writes some code and sends you a pull request, you can see instantly whether the tests passed and whether they added any new tests, that kind of thing. It's important to be pragmatic when you're writing tests: test edge cases, but don't exhaustively test every single possible combination of inputs. It will run slowly, and it's not an effective way of testing.
But it is difficult; it is an art form. Writing good tests is a complex task, and like all these things, it's a learning curve. Having tests is better than not, but having too many tests, or exhaustive tests, is not that useful. The easiest way, I think, to get started with testing is not using any testing libraries, not installing anything, just using the built-in keyword assert. If, for instance, your project defines a function add, which takes numbers and adds them, you can just import that function and assert that add(2, 2) equals 4. If it didn't return 4, it would fail with an AssertionError, just a standard Python exception; if it passes, it just carries on. So asserts are really useful for quickly testing things. A good way to structure them is to put them in functions, so you have a test_add function containing multiple tests. And then there's pytest, which is a really cool testing library, but at its most basic level it can be a really nice runner for your standard assert tests. If you name your files and functions like this, so you have a tests folder, your files are named test_something, and your functions are named test_something, pytest will find and run them. My structure looks something like this: I've got test_add.py, and that contains a test called test_add, and you can see that when I ran it, it passed. It's a bog-standard simple example, but you can imagine that for much bigger projects you'll have reams and reams of tests passing, and you'll see when anything fails, when you've broken anything. pytest also gives you some additional features; the main one I use is testing exceptions. It's quite difficult, in fact impossible, using assert on its own to assert that an exception got raised, because the exception will blow up your program.
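A plain-assert test in the style just described might look like this. In a real project you'd write from project import add; the function is defined inline here so the sketch stands alone:

```python
# tests/test_add.py -- pytest discovers this via the test_ naming
# convention, but it also runs as a plain script.

def add(a, b):
    # stand-in for `from project import add`
    return a + b


def test_add():
    # a few pragmatic checks, not an exhaustive sweep of inputs
    assert add(2, 2) == 4
    assert add(0, 0) == 0
    assert add(-1, 1) == 0


test_add()  # pytest would call this for you; called directly here
print("ok")
```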
So the way you do it is you import pytest, you say with pytest.raises(SomeError), and you use that context manager and put the code you're expecting to raise the error inside. Now, if it doesn't raise the error, the assertion fails. So that's a good way of testing that as well. Mock is a really good library too; since Python 3.3 it's been in the standard library, in the unittest module, as unittest.mock. This is a really simple example of using a mock in your test: you create a mock object that in this case has a method called message, which just has a return value of "hello". So you're mocking up an object that has a method with some predetermined return value, and you can see there I've got my mock object, and when I call m.message(), I get the string "hello" back. Another thing mock comes with is something called patch, which is a good way to patch some functionality that's not in your library but that your library relies on. Something like this; this is from the GPIO Zero tests. We have an interface for dealing with the time of day: a TimeOfDay object is active between the times that you set and inactive outside of those times. You could wire this up to, say, an LED, and say this LED should be active when this TimeOfDay object is active, so the LED is on between the hours of seven and eight in the morning. Much the same as a button could be connected to an LED, and pressing the button is what controls the LED; this is a time construct rather than a physical button. And with that, I'm obviously using datetime underneath, so I have to patch the instance of datetime within my library, and say: when they call datetime the first time, I want it to return this particular date.
So at 6:59, I assert that the TimeOfDay is not active, because it's not seven yet. Then I tell it, the next time you call datetime, return this, and it's now 7 a.m., and now the assertion should be true. At 8 a.m. it should still be true, and at 8:01 it should have gone back to being inactive. All I'm doing is patching datetime; I'm still actually testing the library, still doing an effective test, but it's the thing underneath that I can't control that gets patched, rather than having to take the current time and add a minute and so on. It's a much simpler way of doing it. Tox is a really cool tool for running your tests in multiple Python versions. If you're on Ubuntu, and you look up something called the deadsnakes PPA, you can apt install multiple Python versions, not just the one that comes with your distribution. All it takes is a tox configuration file that describes which Python versions you want to run your tests in; you have to have them installed, otherwise it will just give warnings saying it couldn't find this Python. So that's a really good way of running the tests in multiple Python versions on your own machine. A lot of the time, if you still support an older version of Python and you use a new bit of functionality like f-strings, it passes on your machine because you're running Python 3.7 or 3.8, but then tox tells you, oh, this one failed because I don't know what this f-string thing is. So it's good to be able to do that. Coverage.py is another really cool tool, which does coverage analysis of your programs: based on your test suite, it checks which parts of your code, which lines of code, have been touched by your tests and which haven't.
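Pulling the mock and patch ideas from a moment ago together, here is a self-contained sketch. ClockCheck is a stand-in class I've invented to mimic the shape of GPIO Zero's TimeOfDay, not its real API, and the dependency is held as a class attribute so patch.object can swap it; the real tests patch the module-level datetime instead, but the idea of controlling "now" is the same:

```python
from datetime import datetime, time
from unittest.mock import Mock, patch

# The basic Mock idea: an object whose methods return canned values
m = Mock()
m.message.return_value = "hello"
assert m.message() == "hello"


class ClockCheck:
    """Active between start and end times, inactive otherwise."""

    datetime = datetime  # the dependency we patch in the test

    def __init__(self, start, end):
        self.start = start
        self.end = end

    @property
    def is_active(self):
        now = self.datetime.now().time()
        return self.start <= now <= self.end


clock = ClockCheck(time(7, 0), time(8, 0))

# Patch the datetime dependency so we control what "now" returns,
# instead of doing arithmetic on the real current time.
with patch.object(ClockCheck, "datetime") as mock_dt:
    mock_dt.now.return_value = datetime(2020, 1, 1, 6, 59)
    assert not clock.is_active  # 6:59, not seven yet
    mock_dt.now.return_value = datetime(2020, 1, 1, 7, 0)
    assert clock.is_active      # 7:00, active
    mock_dt.now.return_value = datetime(2020, 1, 1, 8, 0)
    assert clock.is_active      # 8:00, still active
    mock_dt.now.return_value = datetime(2020, 1, 1, 8, 1)
    assert not clock.is_active  # 8:01, inactive again
print("all checks passed")
```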
And if you've got something with branches, like an if or a for or a try/except, where there are multiple different ways the code could go, it will identify which branches didn't get touched. It could be something like: it always goes into the if and never into the else. And it shows you, actually, you are not testing this part of the functionality, which is a good way of getting good coverage with your tests. It's not completely foolproof, because it's really easy to just chase the number and fill in all the gaps and sort of hack your way through it; again, it's an art, and there's lots to learn, but it's a good indication. Coverage for GPIO Zero, for instance, gives you something like this: these are all the different files, and it shows you which lines are missing from the tests. If something's at 98%, perhaps you're not that bothered, but if something's a lot lower, you might want to go and investigate: actually, we're not testing large parts of this file. So, Travis CI; CI is continuous integration. This is an online service which is free to use if your project is open source. You can define which Python versions to run, and as soon as you push to GitHub, or if there's a branch or a pull request, it will run all your tests on the Travis servers and give you a report saying it passed on all these versions, or it failed on 3.5, or whatever, which is really useful. And it also feeds back to your GitHub, as does code coverage: if it's a pull request, for instance, it will post a comment on the pull request saying, yes, all the tests passed, but the code coverage went down by one point something percent, that kind of thing, which can be really useful both for you as the maintainer and for any contributors who filed the PR.
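For a flavour of the setup, a .travis.yml for a project like this might look like the sketch below; the version list and install commands are illustrative, so check Travis's own documentation for the current syntax:

```yaml
language: python
python:
  - "3.6"
  - "3.7"
  - "3.8"
install:
  - pip install -e .
  - pip install pytest coverage
script:
  - make test
```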
So again, just revisiting Makefiles, an example here, because running the tests is sort of non-trivial. It's not just a case of typing pytest and hitting enter, because I'm using coverage alongside pytest and I'm using a particular configuration file. Defining that in here just makes it much easier for people, because they just know to run make test. And if that changes in future, if I'm adding another library that I'm using underneath or changing it somehow, or the pytest command changes, make test still works; you just update the definition. So for stuff like this, it's really, really useful to have a simple make test target.

Documentation. So, according to Daniele Procida of Divio, who is a friend of the Python community and a previous chair of the PyCon UK conference, and who wrote a brilliant blog post on this which you can read on the Divio website, there are four types of documentation. He says there are tutorials, how-to guides, explanation, and reference. Reference is quite a common one: you'll find people document their APIs. So, this is a function that does this; this is a method that works like this and takes these parameters, that kind of thing. But they'll also bundle in things like backstory, and "oh, this is an in-joke", and "this is how you install it, and if you're on a Mac then you do this", and it bloats and becomes really messy. So his whole proposal is that we should be splitting documentation into those four things. He gives a brilliant talk about that whole concept, which is really worth reading about on their website as well. But yeah, documentation is really useful. So again, let's look at really easy ways to get started with these things, and at more advanced routes as well. A really easy way to document stuff is just to put readme files in your GitHub repository.
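The make test target described above might be sketched something like this (the configuration file name here is a hypothetical illustration, not the project's actual file):

```make
# "make test" hides the real test invocation behind one stable command.
# If the tooling underneath changes, only this recipe needs updating.
test:
	coverage run --branch --rcfile=coverage.cfg -m pytest
	coverage report
```

The point is that contributors never need to know the exact coverage or pytest flags; the Makefile is the single place that knowledge lives.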
So write them in Markdown. If somebody's looking at your project, even if it's not published to PyPI or available as a Debian package or whatever, just by coming across it on GitHub they can read the readme. So this is just a couple of examples of what Markdown documentation looks like on GitHub.

And Markdown looks like this. It's really, really simple syntax for writing stuff. You've got a hash here, which is a title. Two hashes is a header. Text on its own is just a paragraph. Use hyphens or asterisks to make a list. And a link looks like this: you put square brackets around the text and round brackets around the link itself.

There's also a project called MkDocs, which is a Markdown-based documentation builder. It exports static HTML, full websites of your Markdown documentation. It's easy to write and easy to deploy, and you can self-host it or put it on GitHub Pages or something like that. There's lots you can do with it.

reStructuredText I find a much more stringent sort of markup language. Quite complex, quite a learning curve. But in essence, this is similar to what I showed previously: you've got a title, and a bit more syntax for each of a header and list items. But this is something that's a little bit different. This is a link that points to another documentation page, another page within the project. And you can do things like that; it's a little bit more clever, a little bit more sophisticated. It will actually get the page title, because it has context of what the other pages are, and it will include the page title in the link that way.

So there's a project called Sphinx, which uses reStructuredText. What's really clever about it is that it can extract the docs from your docstrings.
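A minimal reStructuredText sketch of the features described above might look like this (the page name in the link is a hypothetical example):

```rst
My Project
==========

Getting started
---------------

- install the library
- try the examples

See the :doc:`recipes` page for worked examples.
```

The `:doc:` role is the cross-reference mentioned earlier: because Sphinx knows about every page in the project, it can render the link using the target page's actual title.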
So if you write docstrings anyway, you've kind of already written your documentation, and it will build you a site out of your docstrings. You still have the power to choose which pages things go on, for instance. You can also link to the Python documentation, so you can link to a Python function or a Python class from the standard library, and you can also link to other Sphinx projects, perhaps your dependency libraries. So with Sphinx, for instance, you can write a page and say: I want to have this paragraph and this title, and then for each class you just say autoclass, and you tell it which parameters and which methods to provide, and it will automatically pull them from your docstrings and fill them out like this. You might be familiar with Sphinx because the Python documentation and a lot of projects in the Python ecosystem use Sphinx.

Read the Docs is quite a common method of deployment for those. You can have multiple branches, multiple versions, and host whole projects that way. It's really easy to automate: as soon as you do a release, it triggers a new build of your documentation for every branch or every new release.

Graphviz is something I use in my documentation as well, a really cool way of creating little graphs to describe parts of your project. I won't go over this, but this is just a way of describing the relationship between two boxes, that kind of thing. And you can do more complex things like class hierarchy diagrams which are automated from your Python code, which can be really cool.

So we've got a load of stuff in our project now. Just to summarize what we discussed: how to organize your Python module and the module structure, distributing software, PyPI and pip, using GitHub and all the different tools it provides, virtual environments, testing and automated testing, documentation, and software licensing.
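Going back to the autoclass pattern described above, a Sphinx page might be sketched like this (using GPIO Zero's LED class as an illustration; the exact member list shown is an assumption, not taken from the project's real docs):

```rst
LED
===

A paragraph introducing the class goes here.

.. autoclass:: gpiozero.LED
   :members: on, off, toggle, blink
```

Sphinx's autodoc extension then pulls the class and method docstrings straight out of the source, so the prose on the page and the API reference can't drift apart.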
I tend to write about tools like this that I come across. I've got a tooling blog at tooling.bennuttall.com, inspired by my friend Les who does something similar, and I post on there every now and then. So if you're interested in this kind of thing, new tools that I come across, do follow that. And that's all from me, thanks very much.

Thank you very much. We already have questions here. We're a little bit late, but we have enough time for two questions. The first one is: you mentioned those packages for Linux, pip, et cetera, and also for Mac. What about Windows packaging options with Python?

So pip is compatible with Windows. For most projects you'd be able to use pip exactly the same on Windows. I don't know a lot about Windows packaging beyond that, but there are people out there in the ecosystem making it work, and there are some really good Python community members who work at Microsoft on those kinds of things. So I don't have any particular expertise to answer that, but I know there are options. And Python itself, I know, is in the Windows 10 store now, so it's easy to get Python, and it comes with pip. So yeah, you can use pip, but there isn't necessarily an equivalent of something like apt for Windows, not quite in the same way. I think there's something going on at the moment, but it's not quite the complete picture of the open-source ecosystem the way there is on, say, Debian.

Okay, there's another question, and unfortunately also the last question for this session. What are your thoughts on using GitHub Actions instead of Travis CI?

I haven't used it yet. It looks really interesting; I've been meaning to take a look. Yeah, definitely worth looking at. There are some people I've seen in the Python ecosystem using it, and I've heard good things about it. Okay, thank you very much again. Thank you.
And if you want to ask more questions, please go to the Discord channel. You can reach that by pressing Ctrl or Command K and then typing "Maintaining", and the first search result is the channel for the talk. And I see there's already some action there, so please continue there. Thank you very much.