We're going to get all set up for our next speaker. I think you know our next speaker coming on here. Do you want to introduce him? Ed Thompson. Edward Thompson. He's over in Cambridge, and he's going to be doing a great session on building your own open-source project using Azure Pipelines, which is one of my favorite products, part of Azure DevOps, a brand new thing that we just got. All right. Let's take a look at the fly-in and transfer over to Ed. Uh-oh. Jeff broke something. I broke something. It's me just waiting for Jeff to fix whatever he broke. There we go. There's the map. Here we come, Ed. There he is. There's the man. Ah, you're muted going out to him. Your turn, buddy. You're ready to go. Hello, .NET Conf. Good morning, good afternoon, or good evening. That's the great thing about a virtual conference: you can join from wherever you are. I'm here in England today. My name is Edward Thompson, and I'd like to talk to you about the intersection of two of my favorite things: building your open-source project with Azure Pipelines. I'm going to share my slides with you. There we go. Why are these two of my favorite things? Well, let me tell you just a little bit about myself real quick. Again, my name is Edward Thompson. I'm a program manager at Microsoft. I work on the Azure DevOps team, and I focus on two main areas of that product: Azure Repos, which is our Git repository hosting, and Azure Pipelines, which is our continuous integration and continuous delivery platform. That's what I do during the day. But by night, I'm actually the maintainer of a couple of open-source projects. I maintain an open-source project called LibGit2 and another one called LibGit2Sharp. LibGit2 is a C library that does Git repository management. It works at the low level of your Git repository. It's used by people like GitHub and Azure Repos to actually do the repository management. It's used by all the major Git hosting providers.
Just because you haven't heard of LibGit2, and maybe you have, but just because you haven't, doesn't mean you're not using it. Every time you open a pull request, whether that's on GitHub or Azure Repos or really any other provider, you're using LibGit2. It's critically important to me that we have a very complete, robust, and reliable build and test platform, because so many software engineers rely on LibGit2 on a day-to-day basis. So it's very important to me that we use something like Azure Pipelines to manage that. I'll show you more about that in a minute. The other project I maintain is called LibGit2Sharp. LibGit2 is a C library; LibGit2Sharp is the .NET library that sits on top of it. We use that within Microsoft, because many of our products are written in .NET, as you can probably imagine. So again, I'm a program manager for a product called Azure DevOps. You may not have heard of Azure DevOps before. It's new as of this week. Azure DevOps is the next evolution of a product that you may have heard of called Visual Studio Team Services. The great thing about Visual Studio Team Services was that we had all of these tools to help you manage your software project: beginning with agile planning, taking the requirements and setting up the work that you're intending to do for the next sprint; Git repository hosting to actually host your code; and then a build and test pipeline to get your code into production, whether that's a cloud offering like Azure or another cloud provider, or onto your on-premises servers. So Visual Studio Team Services was this great set of products, but a lot of people are coming into software development with other tools. Perhaps you're using Jira for requirements management. Perhaps you're using GitHub to host your Git repositories.
So the idea behind Azure DevOps is to take these tools and break them apart into individual components so that you can adopt just the tools that you need. If you're happy with GitHub, that's great. You still need a proper CI/CD pipeline, so you can adopt Azure Pipelines, which is the build and release portion of Azure DevOps, by itself and plug it into your existing platform. We're really excited about this transition into Azure DevOps. In particular, I'm really interested in Azure Pipelines. Again, that is the continuous integration, build, and deployment aspect of Azure DevOps. Azure Pipelines is incredibly powerful because it enables any language on any platform. We provide you with cloud-hosted build agents running Linux, Windows, and macOS. That's available for everyone, whether you're an enterprise project or an open-source project. But we have some really amazing offers for open-source developers. We actually provide unlimited build minutes for open-source projects. So you can use this to validate pull requests. You can use this to build the master branch in a continuous integration fashion: every time something gets merged into master, you can perform a build. You can do that for free, with unlimited build minutes, across our cloud-hosted pool of machines, and you get 10 pipelines running in parallel. It's actually the most generous offer for continuous integration for open source right now. We're incredibly proud to be able to offer it, and we're hoping that open-source projects will take advantage of it. Again, I'm taking advantage of it with mine, and I'm incredibly excited to be able to do so. I'll talk more about that in a minute, though. The other thing that we have launched with Azure DevOps is a new integration with GitHub. We're actually in the GitHub Marketplace now, so you can adopt Azure Pipelines just by going to the GitHub Marketplace and clicking a few buttons.
It's really super easy to get started. It's in fact so easy that we've seen a number of open-source projects adopt Azure Pipelines. The .NET Foundation has made a push to adopt Azure Pipelines for many of its projects, and this is due in large part to Oren Novotny. So thanks, Oren, for helping out here. We've seen that ReactiveUI and Reactive Extensions are now using Azure Pipelines. Cake, the Cake build system, is using Azure Pipelines. NuGet Package Explorer, Nuke, and Visual Studio Code are all using Azure Pipelines, and I'm using it as well. I'm very excited about this because so many of these projects have such different requirements that getting them all into Azure Pipelines was challenging, but also incredibly exciting and incredibly powerful. So why don't we take a quick look at what Azure Pipelines looks like and how easy it is to get started. Here I have my open-source project. This is LibGit2Sharp. Again, this is the .NET wrapper around LibGit2, and I just want to scroll down a little bit. Here we go. Here in my readme for LibGit2Sharp, you can see that I have a number of these build badges. This is the first one right here with the cute little rocket ship. That is Azure Pipelines. So I'm building the master branch. This is showing the results of the most recent merge into master for this project, and that the build has succeeded. You can also see that I've got a couple of other badges here. This is an important aspect of managing open-source projects: you might not want to rely only on a single continuous integration service, and that's totally fine. You don't need to get rid of whatever CI systems you already have. If you're using, say, AppVeyor or Travis already, you don't have to get rid of those if you want to think about adopting Azure Pipelines. You can just bring it into the mix as well. The philosophy behind this project right here, LibGit2Sharp, is that we want as much coverage as possible.
So we want to build and test on a number of platforms. Before Azure Pipelines, we had just AppVeyor and Travis, two good systems, and we added Azure Pipelines to the mix to give us a third system. So if you already have a CI system for your project, I would still recommend giving Azure Pipelines a look and perhaps getting it set up. It's very easy to get started, and it's free, so there's no reason not to do it. It just gives you another bit of coverage. That is something that we do on LibGit2Sharp, and it's also something that the Cake build system does. If we go over to Cake, this is their open-source project. It's hosted out on GitHub, as you would imagine. Let's look at the pull requests. For LibGit2Sharp, I was showing the build badge for master, showing the most recent build of whatever had been merged into the master branch, and that's great. You always want to make sure that your master branch is building. That's an important consideration for contributors. But the other thing that you want to do for contribution is to build pull requests. If I have somebody contributing to my project, let's say somebody finds a bug in my open-source project, or perhaps they want to add a new feature because my project doesn't do something that they want it to do, they can contribute a pull request with some code changes. You'll see that that's very common in an open-source workflow. You see that there are several pull requests for the Cake project here. These are all various people contributing changes back to the project. Let's go ahead and just click on one. Okay. Here we can of course look at the changes that are being proposed. It's one commit changing six files. But what we really want to do is scroll down, because here are the checks on the pull request. What we can integrate here is different continuous integration providers, like Azure Pipelines, to validate this pull request: to make sure that it builds, to make sure that the tests pass.
So this is incredibly important to get a feel for the quality of a change in a pull request that's coming in. Cake actually has 11 checks running on their pull requests. Let's expand them just to take a look. Wow, this is really wild. If we scroll through here, we see a bunch of different Azure Pipelines builds. They're building on CentOS, Debian, Fedora, Mac, Ubuntu. We haven't even gotten to a Windows build yet. There's the Windows build. But, like LibGit2Sharp, they're also not just using Azure Pipelines. They're using a number of CI providers. Here's TeamCity, here's AppVeyor. The reason that the Cake project uses so many different CI systems is that they actually have different extensions and different implementations for these different CI systems. Since they're used to build projects, they need to understand the context in which they're running. They may behave a little bit differently running in Azure Pipelines versus a different CI provider. Certainly, they may behave a little bit differently depending on the platform. That's why you'll see Windows here, Ubuntu, macOS. The Cake project is really taking advantage of everything that Azure Pipelines has to offer. This is really exciting. I'm really excited to see this. This is .NET Conf, of course, so I do want to talk mostly about .NET projects, but it's worth pointing out that Azure Pipelines is not just for .NET projects. If you are working on a project that has, say, a .NET back-end and a different front-end, maybe you're using TypeScript in your front-end, you may want to integrate Azure Pipelines for all of those pieces. I mentioned that I work on LibGit2Sharp. LibGit2Sharp is the .NET wrapper on top of a C library, LibGit2, and so we use Azure Pipelines on the LibGit2 side as well. It's perfectly workable to build your C applications as well as .NET.
Of course, we can build anything, really: Java, Ruby on Rails, Node.js. But here we have an example of an entire architecture, where you have .NET on top of a C library. Again, in my readme, I've got my build badges. You can see here, this cute little rocket ship. That is my Azure Pipelines build for LibGit2. Unlike LibGit2Sharp, where we have multiple CI providers, or Cake, where they have multiple continuous integration builds set up, LibGit2 actually only has the one. We only use Azure Pipelines for LibGit2. We also have a system called Coverity set up. That is actually triggered by Azure Pipelines to do some static code analysis for us. But the builds themselves, this badge right here, are all Azure Pipelines. If I click on that, we can actually see the most recent build into master. If I click on logs, what you'll actually see is the different platforms that we build on. You can see that there's a lot. Similar to Cake, we have a lot of targets for LibGit2. The reason that we adopted Azure Pipelines as our only CI provider is that it's the only one that will bring us cloud-hosted macOS, Windows, and Linux build agents that we don't have to manage. This was a big win for us: being able to consolidate on a single CI provider simplified our setup considerably. We're very excited to have that, and you can see that we have a number of builds on a number of different architectures, Linux, macOS, Windows, with different compilers and different library options. That makes us very confident when we make a change to LibGit2. Again, there are a lot of developers using LibGit2 behind the scenes. It's very important for us to make sure that we are confident about the changes that are going in. We're a little bit conservative as a project, and we just want to make sure that we don't break anything.
The other interesting thing about Azure Pipelines is not just that it supports Linux, Mac, and Windows, but that it does so by providing you virtual machines to do your builds. Some CI providers offer you Docker images in the cloud, building inside a Docker container. Azure Pipelines actually provides you with virtual machines. The reason that's interesting is that you don't have to limit yourself to Linux, Mac, and Windows. If you wanted to, say, start some virtualization, you could run other architectures. Combining Docker with QEMU, the QEMU emulator, you are able to get started with other architectures. This is a pull request that I'm working on. You can see it's not done. This big red X tells me not to merge anything, because it's not building and the tests aren't passing. It's something I'm working on, which is to bring other architectures into our build pool. Here I've got ARM32 and ARM64. Here I've got PowerPC64. The power of Azure Pipelines is giving you a virtual machine that you can use for anything, including virtualizing other architectures. We're really excited to be able to expand our platform support even wider than what we expected. Azure Pipelines is incredibly powerful for that. Obviously, we're not quite there yet, but we're getting close. Those are three projects that are using Azure Pipelines. Why don't we take a look at how easy it is to actually get started for your project? Here we've got a very simple project. It's called HexDump. One of the things that I do, again, is spend a lot of time working on low-level bits in Git repositories, and most of Git's metadata is actually text files. That's great. Maybe they're compressed, maybe they're in a different format. But some of them are not. Some of them are actually binary. I wrote this little app to help me work with them. It's a .NET Framework app that emulates the command-line hexdump tool that comes with BSD. That's how I grew up.
That's what I'm familiar with, and that's what I want to use to analyze a binary file. I know that's crazy, but it's what I wanted. Moving over to Windows, I wanted to be able to run that, so I built this little tool to help me do it. Again, very straightforward. It's just a handful of C# classes, super straightforward. What it doesn't have is any sort of build associated with it right now. It's out on GitHub. It is open source. I've got this license right here. It's the MIT license, telling people that they can contribute. But what it doesn't have is any way for me to know whether a contribution builds. If somebody does want to open a pull request, maybe they want to emulate a slightly different mode, a slightly different output that they're more comfortable with. They could make those changes and send a pull request, and I would have no idea if it even built. I could eyeball it, but I might miss a typo. I could download it to my local machine and build and test it locally. But what I really want to do is be able to see, right from the GitHub pull request page, what the status of that build is. It's very easy for me to get started with Azure Pipelines here to validate pull requests and to perform continuous integration builds on the master branch. I can get started right from the GitHub Marketplace. That's in the black banner at the top of GitHub.com. So I'm just going to click on Marketplace, and I'm going to scroll down to the continuous integration category on the left-hand side. I'll click that, then I will scroll down to Azure Pipelines and click on that. I'm already using Azure Pipelines for a number of my projects. You can see I've already purchased this, but I can get started by setting up a new plan. What I've done is I've actually got an organization set up on GitHub that I want to install this Marketplace app into. Again, it's totally free for open-source projects, and this page gives you all the information about that.
Once I've read that, I can just click install for free, and I can change where I want to install it. I've already installed it on my personal account. I've already installed it on LibGit2, as you saw, but I've got a number of other accounts, and I'm going to select this one, etompson.net.com. There we go. I created this organization just for you all. So when I click complete order and begin installation, I need to select which repositories I want to start building. I want to do one at a time, and we started out on that HexDump application, so I'll click install. I need to authenticate to GitHub to make sure that I'm allowing it to have access to that repository. Then Azure DevOps will set me up with a new project. I've of course got a few projects already, but I wanted to log in with a different Microsoft account, and I've never logged in to Azure DevOps with that account before. So what it's going to do is create me a whole new Azure DevOps project for my pipeline. You can see it's called etompson.net.com. What a great name. The first thing I need to do to get started building my repository is to select it. Again, I only gave Azure Pipelines access to that one repository, so that's the only one you see here. Had I selected all repositories earlier, you'd see the full list. So I'll just click on that. Then Azure Pipelines will actually analyze my project. It will look at it on GitHub and try to decide what kind of project is being built. In this case, it's a .NET Framework project, so it's recommending this set of build steps for me. If I were building a Node.js project, it would look for maybe a package.json or something and suggest that it be built with npm. If I had a CMakeLists file, then maybe it would try to build with CMake. All of these smarts are built into Azure Pipelines to know what kind of project you might be building and suggest a template for you. So let me click on the .NET template.
You can see this is a YAML file that describes the build. It'll do some NuGet restoration, it'll run a build, and then it'll run tests. Very straightforward. If you're not familiar with YAML, this is basically a simple description of how the build should be performed. If you prefer a nice graphical designer, you can have that too. Azure Pipelines does offer a wizard where you can click through and set up your build steps. I prefer this YAML configuration. The important thing here is that this YAML gets checked in to your repository, right alongside your code. That's very, very important. It's an idea called configuration as code. I think we've all worked on projects where you're changing some code, and changing some more code, and then you need to change the way that the build works to accommodate those changes. So you change your build definition, and then you change your build definition a little bit more. But sometimes you need to be able to go back to previous versions of the code. Maybe you need to patch a release that's already in production and give a customer a new build or deploy something. But you've made those changes to the build system. When you check in your build configuration, then when I go back to that previous version of the code, I also go back to the same build definition that originally built that code. So it's incredibly important to have these two side by side, so that you always know what build needs to be performed, and how, for the particular code that you have checked out. Very, very powerful. So I very much encourage you to use this YAML configuration. When I click Save and Run, this will actually check the file in to my Git repository on GitHub. I could create a pull request, and of course pull requests are always the best practice. They let your collaborators see what you're doing, allow them to comment on it, and give you a dry run of things before they actually get checked into the master branch.
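To make that concrete, the .NET template produces a pipeline along these lines. This is a sketch, not the verbatim generated file; the exact task versions and VM image name in your generated YAML may differ:

```yaml
# azure-pipelines.yml, checked in at the root of the repository
trigger:
- master            # continuous integration builds for the master branch

pool:
  vmImage: 'vs2017-win2016'   # a Windows agent from the hosted pool

variables:
  solution: '**/*.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'

steps:
- task: NuGetToolInstaller@0          # make sure nuget.exe is available

- task: NuGetCommand@2                # restore NuGet packages
  inputs:
    restoreSolution: '$(solution)'

- task: VSBuild@1                     # build the solution
  inputs:
    solution: '$(solution)'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'

- task: VSTest@2                      # run the tests
  inputs:
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
```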
But hey, it's a demo, so I'm going to skip over best practices for now and commit it right into the master branch of my GitHub repository. When I do that, it will kick off a new build. So it's going to configure it. You can see this little cursor right here; it's actually uploading that change to GitHub. Now it is looking for a build agent. Again, we have hosted build agents in Azure for Linux, Windows, and macOS. So it's going to find one of those available virtual machines, reserve it for you, and then start the build. And you can actually watch it. I love watching my builds, especially when I'm making some change that I'm really excited about. When I'm about to land a really big pull request, I will just watch the build anxiously. The first thing it does is get the sources, so it's going to clone your repository from GitHub. Then it's going to do the NuGet restore, then it's going to run a build, then it's going to run the tests. While it does that, I'm actually going to switch back to my GitHub repository. We can take a look while it's doing that build. It'll only take a minute, but we can take a look at the YAML file that it created. I can actually just click on my repository right there to go back to GitHub, and you can see, again, the Azure Pipelines bot has set up Azure Pipelines. Pardon me. If I click on Commits, I can see exactly what it did. There we go. Again, this is the YAML file that we saw. We could have changed it while it was being set up. We could have added some custom steps, but I think it's easier to always start from a solid base using the template, just to make sure that it works for you before you start customizing it. If we go back, let's take a look at how our build is doing. Here's the nice thing about the Azure Pipelines integration with GitHub: you get this rich build status integration right on GitHub itself. So I get this yellow circle that says the build is pending.
If I click on that, it actually takes me to the check to show me more information about what's going on, and I can view more information on Azure Pipelines. So I can open that up, and GitHub will take me right to the build that's in progress. It's still going. I actually have some tests that are being run. You can see that the build succeeded, it's starting the tests, and those all passed, and there we go. That's everything. So now that yellow circle has turned green, indicating that my build was successful and all my tests pass. Now, if I go back again to my project on GitHub, nothing has really changed here. There's nothing indicating that that master branch build succeeded. I want to change that. I want to provide some indication, for people who visit my repository, that I'm following best practices by using a continuous integration build for my master branch and pull request validation as well. Again, the way that open-source projects commonly do that is by adding that build badge right here on the front page. You saw that with LibGit2Sharp. You saw that with LibGit2. We can do that for even this silly little project of mine. I want to indicate to people that I really am following best practices. What I want to do is go back to Azure Pipelines. If I go to my builds view, this will show me all the builds that have succeeded, or rather all the builds that have been attempted, for my project. There's only the one so far. Despite that, what I can do is click this ellipsis right here and come down to status badge. All I have to do is click right there, and it will show me exactly what I need to do to add this status badge to my repository. I can either link right to the image, or I can actually get markdown. I can copy that to my clipboard, and then I can add the markdown to my readme. If I navigate back to my GitHub project, there we go. Now, I can navigate to my readme, and I can click Edit and paste my markdown right in. Great.
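For reference, the markdown that the status badge dialog hands you looks roughly like this. The organization, project, and repository names below are placeholders, not the actual ones from this demo:

```markdown
[![Build Status](https://dev.azure.com/my-org/my-project/_apis/build/status/my-repo?branchName=master)](https://dev.azure.com/my-org/my-project/_build/latest?definitionId=1)
```

Pasting that at the top of the readme renders the live badge image, linked to the most recent build of the branch.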
I want to preview those changes just to make sure that I did it right. Let's hit reload to try to get that image loading correctly, try that again, and again preview those changes. There we go. It took just a second to load that up. Once I'm happy with that, I can commit the changes to my repository. Great. Now, this time, I want to start a pull request, so I'm going to go ahead and click Propose File Change instead of committing directly to master. Again, I want to use best practices, as I'm trying to indicate to people that my project uses best practices, so I want to actually do that. I'll click Create Pull Request. Now, the cool thing about this is that you'll actually see this field right here load. There it is. What happens is that every time a new pull request is opened, GitHub communicates that information to Azure Pipelines, and Azure Pipelines will start a build to validate the pull request. That's really great. This is a simple one, admittedly. Very straightforward, just adding this build badge. We can of course just flip over to it and view the changes. There we go. Looks good to me. In this example, we might want to skip waiting for the continuous integration build, so I'm going to go ahead and merge the pull request. There we go. Now, if I go to my project's front page and scroll down, there we go. We see that nice build badge telling us that our master branch builds are passing and our tests are passing for this project. That's great. But again, we're really worried about best practices here, and maybe it wasn't so great that I just merged that pull request without actually letting the continuous integration build finish, so that I knew that it was safe. Well, that's actually another very powerful integration between GitHub and Azure Pipelines: you can enforce that the builds for a pull request complete, and in fact complete successfully, and that the tests pass, before the pull request can get merged. Let's take a look at that.
If I go to the settings for this project and select branches, I can add what are called branch protection rules. Branch protection rules keep your master branch healthy. So I want to add a rule, and I want to apply this rule to the master branch. I could actually use a wildcard here and apply it to all branches, but in this case, I really only want to apply it to the master branch. So I can require that pull requests are approved before merging. But, interestingly, since it's only me on the project right now, I would be the only approver, so I don't want to add that one. What I do want to add is that status checks must succeed before a pull request can get merged. So I'm going to check that, and then I could have a number of status checks. I only have the one so far, and that is my Azure Pipelines build, so I want to mark it as required. What this does is enforce that the pull request cannot get merged until the build succeeds in Azure Pipelines and all the tests pass. I can also click to require that branches are up to date. But the one I do want to add is, there we go, include administrators. That means even I, as an administrator of this project, can't merge a pull request until the build succeeds. So I'll click Create there. Now, if I go back to my project, let's actually change one of the tests. I'm going to go ahead and edit it right here on GitHub. The way this project works is by opening a stream and writing it out in hexadecimal format. It's very straightforward. But one of the things that it does that's a little bit interesting is that it allows you to skip to a certain part of the file. It does that by using the standard Stream.Seek function. But there's something interesting about the .NET Stream class, and that is that it doesn't have to implement Seek. But since we only ever seek forwards, we can wrap the system stream with our own stream.
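The idea behind that wrapper can be sketched in a few lines. The actual project implements it as a C# Stream subclass; this is an illustrative Python sketch with a hypothetical function name, showing the core trick of reading and discarding bytes when the underlying stream can't seek:

```python
import io

def skip_forward(stream, count):
    """Skip `count` bytes forward on a stream.

    Uses the stream's own seek when it supports seeking; otherwise
    reads and discards bytes instead. Less efficient, but it works
    on any readable stream.
    """
    if stream.seekable():
        stream.seek(count, io.SEEK_CUR)  # fast path: real seek
        return
    remaining = count
    while remaining > 0:
        # read in chunks and throw the bytes away
        chunk = stream.read(min(remaining, 8192))
        if not chunk:  # stream ended before we skipped enough
            raise EOFError("stream ended while skipping forward")
        remaining -= len(chunk)
```

The test described below (five bytes, skip two, read three) exercises exactly this behavior: after skipping two bytes, the next read must return the last three bytes of the buffer.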
We call it a forward-seeking stream, and we can just read those bytes and throw them away instead. It's not as efficient, but it does get the job done. So we have some tests here to make sure that we understand how that works. We create a simple buffer with five bytes in it. We seek forward by two bytes, we read three bytes, and then we validate that the three bytes we read are the last three bytes of the buffer. Very, very straightforward, right? So let's change this test so that it shouldn't succeed anymore, right? And I'm going to propose it. Sometimes as an open-source maintainer, you get changes with not a lot of context. This is one of those. You look at it and you're like, well, I don't really understand why this change was made. That's very frustrating, right? So let's emulate that sort of workflow. I'm going to go ahead and create a new branch. You notice that this is actually locked out for me. I can't commit straight to master. Even as an administrator, I can't commit straight to master. Certainly the people contributing to my project wouldn't be able to either, so they would be taken down the pull request workflow. So we'll create that pull request. And again, just like before, you'll see that the status checks go here. The yellow circle indicates that they're being executed. So it's talked to Azure Pipelines, starting that build. And if I click on details, it'll take me to more information about the check, and I can click to open it in Azure Pipelines. Because, again, I really do just get a kick out of watching this build output scroll by, okay? So that'll take just a minute to do the NuGet restore, to do the build, and then to run the tests. Okay. Hey, Edward, while it's doing that. One sec. Yes? Hey, we have a couple of questions in the chat room I wanted to see if we could relay and get your feedback on. Yeah, absolutely. So you've shown us some simple examples here of YAML configuration.
Are there more complex examples available somewhere else out there on GitHub, maybe, that we can refer folks to later when they have more complex solutions they need to check out? Yeah. The best way, the thing that I like to do, is to actually look at examples of real-life builds. For example, you can pull up LibGit2. Again, we've got a number of architectures, and we're using this YAML right here. It's doing some interesting things like using templates and running a mix of Docker and actual on-host builds. So I would look through the examples in those repositories, like VS Code, like LibGit2, that I linked to in the beginning, and try to get an example of what they're doing. I would also look at the documentation that's on the... I don't have the link in front of me, but you can Bing it: Azure Pipelines YAML. Cool, that's good. Very cool. And so when you are targeting the various Linux platforms, right, when you're targeting Linux, is there a way to specify the different Linux flavors that you want to test against? There is. Out of the box, we just give you an Ubuntu build host. And you saw, like with Cake, they have all of these various Linux distributions that they're using: CentOS, Debian, Fedora. The way they're doing that is running them inside a Docker container. That's what we do with LibGit2 as well. We actually have Docker images that we have set up, and so our build is actually just running inside a Docker image. It's totally flexible and allows you to customize the environment to exactly what you want, what you expect. So that's really cool: you're still running the same process, but they're choosing the different operating system just by choosing an appropriate Docker container to go along with it. Yeah, that's exactly right. So is it possible to run Pipelines locally like Jenkins, or is this strictly an Azure thing?
So, what I don't have in front of me, unfortunately, is my handy Raspberry Pi. I mentioned that we want to have a number of platforms running for various things, and so we can look at some on-premises build agents. Our build agent is a .NET Core application, so anywhere .NET Core runs, and that's a lot of places, you can run our build agent. I started porting it over to Raspberry Pi, for instance. So I have a little Raspberry Pi that I can plug in and set up as a build agent. I don't have to use these Azure-hosted ones; I can set up my own build pool with on-premises machines. That's especially important if you have really complicated dependencies that maybe you need inside your firewall. Totally happens, I get it. And so that's a way to do that. You can actually mix and match as well: you can have some Azure-hosted build agents that we provide and some on-premises build agents that you run yourself.

Very cool. And then the last question that we have here. Typically when people see the concept of continuous integration, they're thinking unit tests. Are we also considering being able to run pipelines with integration tests, so that you can get that richer bit of feedback as the build happens? Ooh, interesting. So I always struggle with the definition of an integration test, because I feel like these terms mean various things to various people. Or it might just be that I'm really ignorant about it, or it could be both. So yes, I think one of the things you can do is leverage Azure Test Plans. Azure Test Plans are another part of Azure DevOps that allow for richer test integration and even manual testing. The other thing that you can do is add gates within your pipeline. So maybe it's not something you would want to do as part of the build, but you would start to think about having release gates as you were getting ready to deploy an application.
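A staging-to-production gate of the kind described here can be sketched with multi-stage YAML pipelines, which arrived after the classic release pipelines shown in this talk. This is a sketch only; the manual approval itself is configured as a check on the `production` environment in the Azure DevOps portal, not in the YAML, and the environment names and deploy script are hypothetical:

```yaml
# Illustrative sketch: deploy to staging automatically, then wait
# at a gate before production.
stages:
- stage: Staging
  jobs:
  - deployment: deploy_staging
    environment: staging          # no checks: deploys automatically
    strategy:
      runOnce:
        deploy:
          steps:
          - script: ./deploy.sh staging

- stage: Production
  dependsOn: Staging
  jobs:
  - deployment: deploy_prod
    environment: production       # approval check configured on this environment
    strategy:
      runOnce:
        deploy:
          steps:
          - script: ./deploy.sh production
```

With an approval check on `production`, the pipeline pauses between the two stages until somebody signs off, which is exactly the manual staging-to-production gate described in the answer.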
And so you can actually pause the deployment at a gate and wait for something to satisfy it, because that could be a manual test. So maybe you deploy to a staging server, and then you have a gate that somebody has to approve manually to move from staging to production. You can also have automatic gates: they could run an application, they could hit a REST endpoint, they could run an Azure Function. And we've seen some really clever things there. Somebody set up Twitter sentiment analysis. So you can start staging a deployment and roll it out slowly, and if you're seeing on Twitter that you're blowing up, with people saying, oh, what's going on, everything's slow, then you can actually stop at that gate to keep it from deploying further. So you get sort of a ring deployment or canary deployment with some sort of automation. It's incredibly clever. That is so cool.

All right, I'll let you go and you can wrap up here in the next minute or two, and we'll get ready for our next speaker. So let's see if that build is done. Back to my pull request. Oh, indeed, there was a build failure, which we expected; we changed that test to be totally invalid. Nothing wrong with that. This absolutely happens when you get started contributing to an open-source project. Sometimes things aren't exactly as you expect them to be. Sometimes code is dense, maybe it's not commented very well. I know I'm guilty of that sometimes when I work on open source. So if somebody were to come in and pick this up, they might not exactly understand what I had in mind, because I hadn't commented it very well, right? So it's totally valid that your first contribution to an open-source project might not build and tests might not pass. And this is an example of that. But the nice thing is that the combination of GitHub and Azure Pipelines has notified me of that as a maintainer. And I can't even click that merge pull request button, because we required that that status must pass.
The build must actually succeed before we're able to merge it. So we're able to take these two very powerful systems and tie them together. So again, Azure Pipelines is part of the Azure DevOps family of products. We're very excited about these; they are the next step of evolution for software developer tools from Microsoft. Despite the name, it's not DevOps in a box. Adopting this won't make you suddenly successful with a DevOps workflow, but it will provide you the tools to help you get there. As my buddy Donovan says, DevOps is the union of people, process, and products to enable continuous delivery of value to our end users. We can help you with the products. So we're very excited about this. Thank you so much for taking the time. Thank you to .NET Conf. Very excited to be here. If you have any questions, feel free to get in touch. I'm @ethomson on Twitter; that's really the best way. So thanks again.

Awesome, thanks so much, Ed. It was really great to have you join us as part of the event. Yeah, absolutely, make sure you reach out to him if you have any questions. Of course, this video will be archived; it'll be available over on YouTube later. And we're gonna be rerunning all of .NET Conf all weekend long. So thank you so much for joining us. That was really great stuff. That was awesome, Ed.