All right. Good morning again. Welcome to your technical content for DrupalCon on Tuesday. Very happy to be kicking off your Tuesday for you. Today we're going to be talking about workflows for PHP libraries. I'm Greg Anderson. I'm an infrastructure engineer at Pantheon and an open source contributor. I've contributed to things like Drush, the Robo PHP task runner, Composer utilities, and things of that nature. Today we're talking about the process of using generic PHP libraries in the context of Drupal. You might have a Drupal 8 site, and you might have another site that's still running on Drupal 7. If you have some technology, especially if you want to interface with some outside system, it's natural to want to factor out your common code into an independent library that you can reuse in both places; then you just make a Drupal 8 version of the module and a Drupal 7 version of the module, and you're good to go from there. Optionally, you might want to add a command line tool, just for testing and automation, and that's another thing that is more easily accomplished if you have a separate library. This is an example of what a composer.json looks like. I'm sure many of you have seen this before. If you add a minimal composer.json to your module and put in a require statement for your library, that tells Composer it should load the library whenever it's using your module. Now, the way the Drupal community has evolved for using Composer with modules is this: once you have one module that uses Composer for dependencies, you should manage your entire site with Composer. Use something like the drupal-composer/drupal-project template, which has both Drupal 7 and Drupal 8 variants. Follow the pattern there, and Composer will pull in all of your modules, and it will also pull in all of the dependencies of your modules.
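As a sketch, a minimal composer.json for a module that requires a shared library might look like this; the package names here are hypothetical, not from the slides:

```json
{
    "name": "example/my_module",
    "description": "A Drupal module that wraps a reusable PHP library.",
    "type": "drupal-module",
    "require": {
        "example/my-library": "^1.0"
    }
}
```

With that in place, any Composer-managed site that installs the module also gets the library pulled in automatically.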
Other solutions like Composer Manager are really deprecated at this point, and you should stick with a straight Composer implementation. So what is the workflow we're trying to set up here? We're talking about the process where you build a PHP library, you test it with PHPUnit, then you drop it into a Drupal context and maybe do some more functional tests with Behat, and you repeat until your module is done. In this session, we're really going to focus on that first step: building the library, testing it, and making sure it's perfect. However, PHPUnit isn't the only thing we need. We also need collaboration, reproducibility, analysis, documentation. There's a whole ecosystem of things you need to be concerned about if you're going to collaborate with other people on software, whether within your team or in a community. Some of you may be familiar with George Jetson. George Jetson had a really demanding boss. Sometimes George would come home at night and say, "Oh my God, my boss is a slave driver. He made me push the button three times today." That's the workflow we're going for: the George Jetson workflow. When it's time to roll out your module, you just push the button and the right thing happens. Work hard once; don't do that stuff over and over again. Then you can complain to your significant other when you come home: "Oh my gosh, I had to run CI three times today." But all things take effort. To win the benefits of having a George Jetson workflow, you have to spend some time to set it up. There's a fairly well-known xkcd cartoon where the author asks: how often do you do this task, and how much time does it take you? From that he built a little matrix, and he says, okay, you're allowed to spend this much time, and no more, on making your tool better; otherwise it's a waste of time.
And this is a funny cartoon, because as engineers we sometimes get so focused on improving our tools that we never get on to the real product, and that, of course, isn't good. But what the cartoon misses is that not all time is equivalent. If you get a phone call from a customer saying, "My God, the site's down," and you trace the problem to a bug in your library, then if you're depending on manual processes and it takes you longer to respond to that incident, that's a lot worse than if you'd spent an entire day making it possible to respond to incidents quickly. So consider all of these other factors when you're deciding how much time to spend on this. Onboarding new engineers, reproducibility, and the increase in confidence make automation really worth it. Fortunately, there's a whole bunch of services that come to the rescue and make it a lot easier to get this automation done. We have CI services like Travis that will run tests for you. Scrutinizer is a really cool tool that does code complexity analysis. Of course, if you're going to use Composer, you need a package manager, and Packagist is the standard one. Coveralls will give you reports of how much coverage you've got from your various PHPUnit tests. VersionEye is a neat little tool that will tell you about the open source license usage in all of your components. And of course, there are also a number of tools that will help you publish documentation online easily. These are the different things we're going to be talking about today. In these examples, I'm going to be using a few projects as templates. One is LCache. This is a caching library that was developed at Pantheon, but it's general purpose; it's not Pantheon specific, and it's not even Drupal specific. It's an independent library, and it works on WordPress as well. Consolidation is a collection of projects that provide utilities for Symfony Console applications.
And finally, I just threw in an example website. It's an old Drupal 7 site with some automation; it's not running on Pantheon. If you have something you want to set up, you should be like Wally and utilize code reuse: see what else is out there. Everybody's using GitHub, which allows you to collaborate through the browser without having to set up Git, and there are lots and lots of integrations. This is going to be the cornerstone technology that enables the things we're going to be talking about. But I also want to go over how to get the most out of GitHub, because GitHub is going to be the front page for your project if you care about other people collaborating on it. There are some neat features of GitHub that are in some cases underutilized. GitHub is a good way to go because if you click on the pencil icon, it brings up an editor and people can submit pull requests right in the web browser, which means they don't have to learn all of the command-line Git stuff. In some cases that sort of collaboration is really useful and helpful. I'm going to skip this part. If you have a good README, badges at the top will link to your other integrations. They tell people what integrations you have set up, and they give a good indication of whether the project is still being maintained and how the code is looking. If you add a contributing document, which just means committing a file called CONTRIBUTING.md at the root of your GitHub repository, there will be a link to this page any time someone submits a new pull request or opens an issue. This is a great place to tell people what sort of standards you are using in your project. Drupal core as a whole isn't using PSR-2 yet, but if you're outside of Drupal, the rest of the world pretty much is. So when I'm making independent libraries, I like to enforce PSR-2 and advertise the fact that I do in my contributing document. Similarly, GitHub has issue templates.
Instead of getting a blank slate when you're starting a pull request or an issue, the template gives the user something to fill in, which improves the quality of your feedback. Next, I'm going to be talking about Packagist. If you're making an independent library, you're going to have to make it possible for Composer to recognize it, and Packagist is a very convenient web-based system for describing where to get your projects. The integration is a little bit tricky to set up the first time you do it; the good news is that this is the hardest integration we're going to talk about today. I'm going to go over a little of how you describe a project so that Packagist can understand it. Up at the top there's your project name. This is actually optional in composer.json if you're not publishing your project, but it's strongly recommended. The other little trick that's really useful, and that many projects fail to do, is in the extra section: if you create something called a branch alias, then you can say, in this case, that the dev-master branch is equivalent to 1.x-dev. The reason this is important is that if you have one component asking for the development version of your library and another component that says, "I work with 1.x," and you don't have a branch alias, then Composer won't know that your dev-master is compatible with 1.x, and you won't be able to use those two modules together. So put in your branch alias faithfully, and it'll be easier for people to pull in dev versions, for example off of a pull request. Getting started on Packagist is really easy: you just give the address of your repository and hit Check. But in addition to just putting it on Packagist, you're going to want Packagist to automatically pull in the new versions of your library every time you release one. To do that, you have to do a little bit of extra integration.
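The name and branch alias described above sit in composer.json roughly like this (the package name is hypothetical):

```json
{
    "name": "example/my-library",
    "extra": {
        "branch-alias": {
            "dev-master": "1.x-dev"
        }
    }
}
```

With this in place, a consumer that constrains your library to `^1.0` can still resolve against `dev-master`, because Composer now knows the two are equivalent.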
Packagist is going to warn you up at the top that auto-updating is not happening. If auto-updating isn't happening, then every time you release something you have to go and click that green Update button to bring the new version in. But we're going to automate that. Go over to your profile section and, at the top, click on your API token; it will show you the token, which is just a long string of characters. Copy that to your clipboard, and then back in GitHub, in the integrations and services section, you add a service: find Packagist either by scrolling or by typing "Packagist," and click on it. Put your Packagist username in the user field, put the token you got off of your profile page in the second field, and down at the bottom it must say https://packagist.org. Then you add the service, come back to the webhooks section, open your service, and at the top there's a little "Test service" button. Click on that, then go back to Packagist, and you'll see that the warning about auto-updating has disappeared. If the warning is still there, walk through the steps again: double-check your username, make sure you entered https://packagist.org, and you should be able to get it up and running. Some projects have plugins; Drush is an example. Drush has plugins, and if you write a Drush plugin and give it a type of drupal-drush, that advertises that this is something you can use with Drush. If you want to have something in your README that says, "Hey, here's a list of everything on Packagist with this type," you can compose a link and drop it in your README. Really useful for your users. Earlier I showed an example of badges at the top of a README, which are nice waysigns for users. There's a site called poser.pugx.org that will give you additional badges, generated from your Packagist data, that you can drop in.
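For the plugin case, the relevant piece is just the type field in the plugin's composer.json; this sketch uses hypothetical names, and the Drush version constraint is an assumption:

```json
{
    "name": "example/my-drush-extension",
    "type": "drupal-drush",
    "require": {
        "drush/drush": "^8"
    }
}
```

The README link mentioned above would then be a Packagist listing filtered by that type, along the lines of `https://packagist.org/?type=drupal-drush` (check Packagist's current filter syntax).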
It'll tell you things like how many times your project has been downloaded, but the one on this page that I think is really useful is the license badge; I'll talk about licenses a little more later. I don't think the vanity badges are really that useful to include on a project page. They mostly let people know how much your project is being used, and that number isn't really meaningful for much, but a lot of people like the vanity badges. Travis is just one of the many ways you can run tests in continuous integration. The reason I really like Travis is that it makes it a lot easier to test different versions of PHP. This is also possible in other systems, but as we'll see in a moment, it's much easier to set up with Travis. If you set up a file called phpunit.xml.dist, then PHPUnit is going to load your settings from this file. You can gitignore the phpunit.xml file if you want; your users can copy phpunit.xml.dist to phpunit.xml if they need to customize it. But for a minimal configuration for running PHPUnit on Travis, all you need to do is specify your autoload file, where you keep your tests, and what file name pattern to find them under, and then PHPUnit will run all of those files. I also like to use a project called Squizlabs PHP_CodeSniffer. This helps check for conformance with PSR-2, and if you just composer require it, it'll show up in your composer.json and you can make it part of your tests. So here's the thing I was alluding to earlier: if you want to do multiple-PHP-version testing on some other CI system like Circle, you have to prepare Docker images and things like that, which is kind of inconvenient. With Travis, it's a very simple matter of just listing all the different versions of PHP you want to test, and Travis will automatically cycle through them, running the tests again after installing each version of PHP in the image. It makes it a lot easier to get good coverage across different versions.
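A minimal phpunit.xml.dist of the sort described here might look like this; the bootstrap path and tests directory are assumptions about your project layout:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<phpunit bootstrap="vendor/autoload.php" colors="true">
    <testsuites>
        <testsuite name="unit">
            <!-- Run every file matching *Test.php under tests/ -->
            <directory suffix="Test.php">tests</directory>
        </testsuite>
    </testsuites>
</phpunit>
```

Because this is the `.dist` file, anyone who needs local customizations can copy it to phpunit.xml without touching the committed defaults.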
Then down at the bottom, in the script section, after running phpunit we also run phpcs, the code sniffer I showed you earlier. If someone submits a contribution and it doesn't follow PSR-2, the tests will fail and you know that right away. That helps not just you as a maintainer; it's also a good service for your contributors. They don't have to wait for the maintainer to come back with nitpicky style feedback; it happens right away. Finally, Travis has this interesting behavior where it runs a test as soon as a branch is created, and it also runs a test as soon as a pull request is created. That means the same code gets tested twice, because every pull request has a branch. So in the branches section of your .travis.yml, if you add this funny regular expression and say, "I only want to test the master branch and branches that look like version numbers" (because in Travis, tags and branches alike are matched against the branches section), then your pull requests will only be tested once, and this will conserve valuable Travis resources. Strongly recommended. One more thing to make your life really happy: if you add a cache directory, Travis is going to keep it around between builds, and in particular caching the Composer cache directory will greatly speed up your builds if you add that.
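Putting the pieces from this section together, a .travis.yml might look roughly like the following; the PHP version list, the branch regex, and the source path are illustrative rather than copied from any one project:

```yaml
language: php

php:
  - 7.1
  - 7.0
  - 5.6

# Tags and branches are both matched here. Building only master and
# version-like names avoids testing every pull request branch twice.
branches:
  only:
    - master
    - /^[0-9]+\.[0-9]+(\.[0-9]+)?(-.*)?$/

# Keeping Composer's download cache between builds speeds things up.
cache:
  directories:
    - $HOME/.composer/cache

install:
  - composer install --prefer-dist --no-interaction

script:
  - vendor/bin/phpunit
  - vendor/bin/phpcs --standard=PSR2 src
```

Pull requests themselves are still built; the branches filter only suppresses the duplicate build of the underlying branch.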
I recommend committing your composer.lock file if you're doing Travis testing, and the reason is that it really helps with what's called min/max testing. In the matrix I have here, there's a list of different PHP versions, and for some of them we also define an environment variable, say DEPS, which we set to either "highest" or "lowest" or leave empty. When we're setting up a project to test, we run composer install; by default, if you don't have the variable set, it just runs composer install --prefer-dist, which quickly loads whatever is in your composer.lock file. But if you have an independent library, it's also interesting to know whether the code still works against your lowest declared dependencies and against your highest declared dependencies. The lowest test is really important to make sure that no one has allowed any code that doesn't work with the older software to creep in, and the highest test is really important to find out whether any new releases of the libraries your code uses have come out that break your code. To do lowest, we run composer update --prefer-lowest; to do highest, we just run composer update instead of composer install. In both cases, that ignores composer.lock and brings in fresh dependencies, so you get the best test coverage. Many of you have probably heard the rule of thumb that if you're making an application you commit composer.lock, and if you're making a library you don't. Why is that a rule? The reason is that if you don't commit composer.lock, you get highest testing every time. If you're doing a library and you're only running one test, then I agree highest is the best to do. But if you do the whole highest/lowest matrix, you'll get even better testing, and in that case you might as well commit your composer.lock so that the middle tests, the ones just running off your locked versions, run really fast. And the other advantage of this
is that if you have some failure that falls into one of these categories, it's really obvious, just by looking at which tests fail, what the cause is. With just highest testing, suddenly all of your tests have failed and you have to scratch your head and wonder exactly why that happened. It's usually because of a new highest dependency anyway, but with the matrix it's a lot clearer: just one of the tests is failing and all the others work, so this is a dependency change, and I didn't accidentally commit something bad. And look how good it gets. The focus of this screen is on the lower right, consolidation/annotated-command. It's not a very big library; it's pretty darn small, but it's not teeny. If you're not using the techniques I just described, you're not going to get down to the sub-one-minute test times we're seeing here. So there's a big difference if you do your caching and such: only 40 seconds when we're not messing with the composer.lock. If you wanted to make a phar, you could do the hard work, look at the PHP Phar APIs, and start writing some code. But there's a cool project called Box, and if you install that, all you have to do is run "box build" and it will make a phar for you. If you take an existing box.json file, which describes what your phar should look like, all you really have to modify is the part highlighted in the middle; this one is from the Robo project. You say which directories contain all of your sources; you can add additional files that are outside of that location if you just want to cherry-pick some things; and down at the bottom, the finder section gives you finer control over what goes in and what stays out. So with just a little bit of editing, all of your contents will go into a phar, and it actually works right. I have found that it's very easy, using the PHP Phar API directly, to accidentally build a phar that seems to mostly work, but where some of the Phar API functions for retrieving files just don't return anything, and that's very
frustrating and takes a long time to debug. So if you use a tool, you don't have to debug that. Travis also has a deployment feature. In the deploy section of your .travis.yml, if you say "provider: releases" and provide your OAuth token for GitHub, then you can automatically take some of your build results, such as your phar file, and they will get pushed up to GitHub every time you have a release. There's a little section on GitHub where you can go to see all of your releases, and it has tarballs that GitHub makes for you; anything else that you put in there will show up right next to those tarballs. So if you're distributing a phar that you want people to download, you don't need to build it manually and upload it to GitHub every time; you can just automate it as part of your CI, and it will happen every time you push a release, which is to say a version tag. I also like to use a feature of Composer called Composer scripts. If you make a scripts section inside your composer.json file, then you can add new commands that are available from the command line when you run them with Composer. So in the highlighted example, you can run "composer unit" to run the unit tests, and in this particular example I define the environment variable SHELL_INTERACTIVE=true, which prevents Symfony from running code in a way that is destructive to the tests for this project. It's harder to describe all of those requirements in a README. If we start getting into the habit of providing standard commands like "composer test," much like other projects do with Makefiles, that makes it easier for people to just clone, build, and run your project without having to think twice about what they need to do or dig through your README. [Audience question: In the test script, if @unit fails, would that block @cs, or would it keep going?] I would have to test that; I don't remember. But to call it out: in the script list, items that start with an @ recursively run other script entries, and the entries can also be shell commands. When you
have a list of things like that, I don't remember whether the list is interrupted by errors. I think it is, because each command returns its status code. So if you provide testing with Travis, it's a good idea to put a badge on your README. If you go to the status images on the Travis website, you can get, as in the example I'm showing, an image URL. If you wanted to put a badge in a file that was just HTML, you could do that; you can also switch from Image URL to Markdown to get the Markdown if it's going into a README.md. But the image URL is sometimes interesting for a little hack we do, excuse me, at Pantheon. We have some wiki pages that collect, in a wiki table, a whole bunch of lists of projects, and we paste the images of the status badges into the wiki page. Anyone viewing the wiki page then sees the status of all of those projects, so we have made a dynamically updated status page in a wiki, which is kind of fun. Coveralls is a neat little service that will show you what the code coverage is like on your project. It's easy to set up: in your phpunit.xml file you just need a logging section, which tells PHPUnit to produce the coverage information. In the example shown on the screen, the unhighlighted line produces the Clover XML that's used by Coveralls; the one commented out above it will generate HTML pages with your coverage, so if you want to look at code coverage reports offline, you can just uncomment that, run your unit tests, and take a look at them locally. If you put it online, then other users who are evaluating your project can look at the coverage results and see how much code coverage you've got. If you want to run the test coverage code, you need Xdebug installed. On a Mac this is really easy now if you install PHP with Xdebug; if you're not on a Mac, you might want to use a VM. Then you composer require satooshi/php-coveralls, a little third-party utility that helps upload your Clover XML to Coveralls, and then you
just add an after_success section with the travis_retry command, and if you run vendor/bin/coveralls, it will put the results of your Clover XML up on Coveralls for you. Oh, sorry, there's another step: in Coveralls itself, you have to click on this little plus, and then you can just move the little slider to the On position. Once you do that, Coveralls will start keeping records of your build results, and you get a little graph over time that shows how your coverage goes up and down as you add tests and extra features and things of that nature. Once you have all of this in Coveralls, it has a nice little web user interface where you can take a look at the changes in any given pull request, and you can also sort the files by most to least covered. That gives you a nice place to find areas in your code that need a little more attention on coverage. If you click on one of these files, it shows little shadings: red for lines that aren't covered, and green for lines that are. In this example, we have a file that doesn't have great coverage, but it's a really small file, and the uncovered lines are just providing some very trivial defaults, so we can feel sort of good about this. But 100% coverage is even better, and this tool gives us the ability to make decisions about where we're going to spend our time. The Coveralls badge links to the status page that shows coverage over time, and a little embed section will give you the Markdown to add to your README page. Scrutinizer is a particular favorite tool of mine. There are a lot of tools that try to make the claim of doing static analysis on your files, but some of them produce fairly low-value bits of advice. I've found over the last year or so that Scrutinizer, if you really understand what it's looking at, gives you some pretty good advice. It analyzes code complexity and duplication and gives you feedback in terms of bugs and hot spots that will help you focus on where you might want to
refactor. It's easy to set up Scrutinizer: you just give it the name of your GitHub repository and it pulls it in. Unlike some of the other services, it doesn't give you a list, but it's still fairly easy to get started. The other neat thing is that on their website they have a little Chrome plugin. If you choose to install it, you'll get Scrutinizer reports right inside of your pull requests, so if someone submits something and Scrutinizer comes back and says, "This is a bug," you can say, "I'd like you to fix this." Here's an example of an inspection report that I ran right after I fixed a given problem. Scrutinizer was complaining that I had a class called HookManager that was rated as an F, because what it really focuses on is class size and complexity. I had gotten the complexity down, but it was still kind of a large class. Scrutinizer will give you advice about refactoring, and it says that sometimes, if you have a class that's too large, it's doing too many different things. In this case, HookManager was responsible both for managing the hooks and for doing the dispatching. So I factored out a base class for dispatching, implemented a separate dispatcher for every piece of the code that was doing dispatching, and left the rest in HookManager. After that, Scrutinizer said, "Hey, now HookManager looks good to me," and moved it from an F to an A, and all of the little classes I made were also rated nicely. So it's a pretty useful thing. The way I found that is through the code hot spots page: on the left is a list of all of your worst-rated classes, and on the right is what they call PHP operations, which is to say individual methods. You have a couple of different techniques you can use. If you reduce your cyclomatic complexity, which is to say reduce the number of code paths through a method, then your PHP operations will get better. But of course doing that can make the classes longer, and then eventually the
class gets into an area where you're going to want to figure out how to split it up. If you've already done the complexity part, that also helps you figure out how to split up the class. Here's another example; it shows how to fix complexity. In this case I've got a class, and Scrutinizer says, "This is complex; you might want to split it up." It will also draw class diagrams that show how different things relate to each other. I've highlighted this section here, and this is an unfixed issue in some of my code: all of these methods have to do with options, and the other methods don't really talk to them. There's not a lot of crosstalk, so this is a place where I could create a subclass and factor that out of my complex class. Sometimes any static analysis tool will give you advice that you don't like, and then you just ignore it. But I've found that if you recognize what the tool is good at, you can use it for that purpose, and I've been pretty happy with what Scrutinizer does. You can get a badge for Scrutinizer right under your overall score; just paste the Markdown into your README as usual. I'm just going to zoom through VersionEye really quickly. Some people have said that this is not the most important tool, but what are we doing here? We are creating open source software, and there are a lot of different licenses in the world, so I think we should try to pay attention at least a little bit. In our composer.json, we really want to explicitly put in the license that we allow, because this will let other programs, such as VersionEye, analyze it automatically. Here's an example of VersionEye output showing one piece of software that's out of date; in this project, that was a deliberate decision. But the other interesting thing in this graph is that there's a little Apache 2 license embedded in there, and Apache 2 can be a red flag, because if you use something that is licensed Apache 2, that's incompatible with GPL, which
makes for a dicey situation if you want to mix things up. This is just a dev component here; it's not used in the main software. But if we click through on this link, we find out something that VersionEye doesn't tell us very easily: this component is actually dual-licensed under Apache and GPL. If you're not using VersionEye, you can run a command, composer licenses, that will output in your terminal a list of the licenses of all of the software you use. Even if you don't bother to set up VersionEye, I strongly recommend that you occasionally run composer licenses on your PHP library to see how your compliance is doing. And we can see down here that Composer is a little smarter than VersionEye: it tells us right in the output that this component is actually also under GPL. When I do set up VersionEye, what I like to do is take the license badge from Poser that I showed you earlier, and then I edit it so that instead of linking to Packagist, it links to the VersionEye licenses table. If someone comes to my project and clicks on the badge, they don't have to download my project and run composer licenses; they can see the output right from my project page. Read the Docs is a really cool service that allows you to write documentation in Markdown, just like GitHub, and the result is output that looks like this: your Markdown text is on the right, and there's a nice table of contents on the left that people can navigate through. If you do your documentation in Markdown files, it's really easy to edit in GitHub and really easy to update as part of pull requests, so I recommend that. Here's a neat little trick: there's a project called markdown-docs, and if you run it, it will trawl through all of your PHP files and build a nice API doc in a single Markdown file, which you can just add to Read the Docs. Now, the problem with this is that it's not automated. There's no way to have this system automatically generate your documentation off of a pull
request, because Read the Docs is written in Python and this markdown-docs tool is written in PHP, and it's hard to get those two things to play together. So you have to generate the API docs separately and remember to update your documentation. This technique is really easy to get started with, but it's unfortunately cumbersome that it's not automated. So what if you want to automate? You can move on to a more complex tool. GitHub Pages allows you to serve static HTML directly from GitHub, out of a separate branch that you generate on Travis. On GitHub you can launch the automatic page generator, which has a whole bunch of templates you can just click on. One time only, it's going to splat down some HTML, and it looks about like this with the template that I picked. Then, with a little bit of CSS fiddling and cursing, I made it look like this. But mostly you're not going to want to do CSS fiddling and cursing if you can avoid it. I don't really have time in this presentation to talk about all of the options that are available, but there are a whole bunch of different tools. mkdocs.org is the Markdown-to-HTML generator that Read the Docs uses, and you can use it as part of your GitHub Pages process, especially if you're moving from Read the Docs to a more complicated system. Jekyll is really popular, Sculpin is a PHP project that does similar things, and there are many more to keep you out of this hand-CSS-fiddling world. Once you're doing GitHub Pages, it's possible to automate your API generation, and the API generator I like is called Sami. It's part of the Symfony project, and it makes a very Javadoc-like output from your PHP sources. This slide is just showing how to run Sami. I would recommend that you grab this from somewhere that's already doing it successfully, like one of the Consolidation projects, and copy it with a little bit of customization; then you can easily have your code run through Sami. I wasn't kind enough to give a
screenshot for that, okay. This slide is also showing that you need to set up some environment variables, putting your GitHub token and your email address into the environment variables section on Travis, so that the build has the rights to run Sami and push the result back up to GitHub. And here is the diagram I wanted to show you: it shows all of your classes on the left, and you click on one and it shows you all of the methods, and those are all linked up. This was used to make the Symfony API doc, so it's nice and clean. And despite the length of that script I showed you a couple of slides back, it's not really that hard to set up, and it's automated. If you have a module with a medium-to-large API, I recommend taking the time to do it. So that is the extent of what I wanted to go over today. There are going to be some really cool contribution sprints coming up; I hope you will stay for those. The first-time sprinters workshop is at 9 a.m. in room 307. We've got a mentored core sprint also at 9 a.m. in rooms 301 through 303, the big sprint rooms; you may have already been there. And the general sprints are going to move to rooms 309 and 310. So thank you very much for coming. Please find the presentation in the session's slides section, and I'll take any questions anyone might have at this time; please come to the microphone if you have a question. [Audience: Are the slides available somewhere?] I'll certainly tweet it out, and they're all going to show up on the schedule on the DrupalCon site; they should be linked to the schedule. [Inaudible question] No, no.