Hello, friends of .NET. I'm Immo Landwerth, and you can find me on Twitter at terrajobst, where it seems to be my full-time job these days to give advice on how to build .NET class libraries. So we thought it might be useful to do a session on how you can build great class libraries using .NET Standard. And .NET Standard itself is not necessarily the higher-order bit here; it's more like the general approach for building class libraries. So a good chunk of my session today will be tips and tricks on how to make your life easier when you're building class libraries. So let's get started with an overview of what .NET Standard is, because there still seems to be a lot of confusion about what the standard actually is, how it works, and why we have so many different things called .NET something. So first of all, as the name kind of implies, the standard is a specification. And by specification what we really mean is a set of APIs. But it's not just that. It's also a template you can use to build class libraries. You can either do this with Visual Studio, or you can do it with the .NET CLI. If you just say dotnet new classlib, we default you to .NET Standard. And what that means is you're building a class library that is restricted to the API set that is in .NET Standard. So you cannot actually call any operating-system-specific APIs or framework-specific APIs. And that's useful because it means you know that the resulting class libraries will truly work everywhere. So another way to look at this is that the standard is something like the HTML specification. It has a whole bunch of language constructs and other things in it that allow you to express your website, right? And then you have actual browsers, which in our case are .NET Core, .NET Framework and Xamarin, that can run your class libraries in the context of an application. So that's roughly how you should think about it. Another way to look at this is a term that we often use: verticals.
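The CLI flow mentioned here might look roughly like this (a sketch; the output directory name is my own, and the exact template behavior depends on your .NET SDK version):

```shell
# Create a new class library; the template defaults to netstandard2.0
dotnet new classlib -o MyLibrary

# The generated .csproj is tiny and contains essentially just:
#   <TargetFramework>netstandard2.0</TargetFramework>
dotnet build MyLibrary
```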
You can think of the .NET platforms, .NET Framework, .NET Core and Xamarin, as verticals. And they're vertical because they give you an end-to-end experience with an application model and, very often, a UI framework. That is what you build your applications in. And then the standard is the orthogonal thing to that, the horizontal. It covers all the APIs and all the foundational pieces that allow you to express, for example, business logic and general-purpose libraries like a JSON parser, and allow them to run truly everywhere. So what's in the standard? The font is too small, but it doesn't really matter; the real point here is that there's a lot of technology in it. To summarize, it is all the foundational pieces that you already know from .NET. There are really no new concepts in it, usually. It is just things you already know, like I/O, collections, networking, all the things you expect to have when you want to build reusable libraries that run everywhere. And with .NET Standard 2.0 we have about 40,000 APIs or something like that. So it's very, very rich compared to, let's say, portable class libraries and other experiences we had in the past for writing reusable libraries. So it is truly a really cool thing to use. Another question that I very often get is about version numbers. How many versions of .NET Standard do we have, and where do they run, and all of that? The point is: don't worry about it. You should really just start with .NET Standard 2.0. It is the best trade-off between reach and the number of APIs you have available to you. So if you're new to .NET Standard and you don't know how to select the version number, don't worry about it. Just start with 2.0 and you're pretty much good to go. Another thing that always comes up is that the standard is open source. Most of you who have followed .NET for a while now know that pretty much everything we do these days is open source, and the standard is no different.
The only thing that makes this slightly different is that we also have a review board, and the review board has representatives from all the .NET platform owners as well as from the community, which is represented by the .NET Foundation. And anybody can propose APIs to be added to the standard. There are only two things we require. The first is that the API has already shipped somewhere in a stable form. So the .NET Standard review body is not an API design review body. It's not about building new technologies; it's about deciding which technologies should be in the standard and become universal. So if you want to build APIs, you should just go to Mono or .NET Core or whatever concrete platform you have with an open-source process, where you can actually engage with the developers and design APIs. The other thing we require is that one of the people on the review board sponsors the API and shepherds it through the whole approval process. And if you're curious, there's a next version of the standard that is currently in planning. It's 2.1, as you might expect given that the version we just shipped was 2.0. There's a planning doc on GitHub that you can read, and there are pull requests with API additions. So if you want to be involved with the standard, that is where you can do that. All right. Before we go into the demo on how we build .NET Standard libraries, I want to talk a little bit about platform-specific APIs and how they work with the standard, because there's a lot of confusion around that subject. First of all, we announced the Windows Compatibility Pack almost a year ago; actually exactly a year ago, I think, at this point. It's effectively all the things we have in .NET Framework that were not in .NET Core. So it's all the usual suspects like drawing, ODBC, the registry, a bunch of stuff. It's about 21,000 APIs, so it's quite sizable. Some people ask me: what about other platforms?
There's no Linux compat pack and no Mac compat pack, and the reason for that is pretty simple: .NET just started on Windows. So the vast majority of platform-specific APIs we have happen to be Windows-specific APIs. Moving forward, I could totally see a world where we also have a Linux compat pack or a Mac compat pack, where we bring in more technologies that are specific not to Windows but to other platforms. So you can reference the compat pack from .NET Standard. Now the concern is: the standard is supposed to be for libraries that run everywhere. What happens when you use APIs that are not available everywhere? What's the experience going to be? Well, first of all, the experience will be that the application crashes with a PlatformNotSupportedException. So the expectation is that you as a library author either decide that you don't support your library outside of Windows, or that you guard your calls with an if statement that says: if you're running on Windows, call the API; otherwise don't. The problem developers always have is: how do I know how bad it is? How many times do I call those APIs? Do I even know it's a Windows-specific API? And so we have an analyzer for that, which I will demo in more detail, but basically what you get in the IDE is squigglies that tell you that you're calling an API that is not supported. So you get a warning, and then you have to do something about it. And it's not just in the IDE; it also happens on the command line. So even if you're building truly on the command line and don't use an IDE at all, we've also got you covered there. All right, let's talk about building a .NET Standard library. I have a demo project here. The demo project is a pretty straightforward solution. It's a .NET Framework console application and a .NET Framework class library right now, and it's a pretty simple class library.
All it does is it's a logger, basically an abstraction over a logging framework, and it allows you to log statements out. I think pretty much everybody who builds libraries or applications for a living has built something like that at some point in their career. So if I run the application, nothing spectacular happens: the application just logs something, and the log file is written somewhere. Not too spectacular. Now let's say we want to use this logging library on .NET Core. Right now this thing is targeting .NET Framework, in our case 4.6.1, but it could be 4.5; it doesn't really matter. So how would you convert this thing to the standard? The first thing you should do, if you have a packages.config file, is migrate it to PackageReference. Let me actually change one setting here first... interesting, let me check one thing here. Yeah, here we go. So you have this tool now, "Migrate packages.config to PackageReference", so you don't really have to do it by hand. All you have to do is invoke this wizard, and it will tell you whether the packages can be converted to PackageReferences, and if so, it will just do it for you. And PackageReferences are the way we do things moving forward in general. So even if you stay on .NET Framework, you should generally move to PackageReference as the better way to reference NuGet packages. So now we have this project still targeting .NET Framework, and the next step is to convert it to .NET Standard. The easiest way to do that is to just create a new .NET Standard project. I don't even care what it's named, because the only reason I'm creating it is so that I can edit it and grab this template here. Then I'll go to my .NET Framework project, click unload, click edit, and now I will replace the contents of this file with this new thing here. And now comes the time when we actually have to read some XML and be aware of how MSBuild works.
Most of the things here are just default settings; you can safely ignore them. References are defaulted on .NET Standard, so you don't have to worry about them either. CS files are included by default, everything in the directory, so you don't have to worry about those either. So the things you usually have to worry about are PackageReferences or project references, and you want to cut those guys out and put them in here, and then you can basically delete all the rest. And now you basically have a new project, and if we click reload... I may have done something bad. Let me edit the project again. Oh, yeah. A good lesson: you need to be careful when you copy XML snippets that you stay consistent. PackageReferences have to be in item groups in MSBuild, so let's quickly fix that. That's my first demo fail; brace yourself, there might be more, because this session is pretty demo-heavy. Now we can remove this project because we're not going to need it. And when we look at the properties of this thing, it now targets .NET Standard 2.0. That's what we'll leave it at for now, because that's what gives us the best API set. Now, if we rebuild this thing, I get a bunch of build breaks. The first ones you run into are duplicate attributes, because they used to be written to a file by the project template, and now they're generated during the build. In my case I would just delete them, because I don't need them; I'll just use the ones generated by the build. And now we get our first actual error from the porting exercise, which is: you're calling registry APIs, and the registry APIs are not available in the standard. So what do you do? Generally speaking, when you encounter these, my recommendation is to start by going to NuGet and installing the entire compatibility pack, because that avoids you having to hunt down individual packages. Long term, you should probably replace it with the individual packages.
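The trimmed-down, SDK-style project file after the conversion might look roughly like this (a sketch; the compat pack package name is the real Microsoft.Windows.Compatibility package, but the version shown is illustrative):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <!-- The Windows Compatibility Pack brings back .NET Framework-era
         APIs such as the registry, ODBC, and System.Drawing -->
    <PackageReference Include="Microsoft.Windows.Compatibility" Version="2.0.0" />
  </ItemGroup>
</Project>
```

Everything else from the old project file, default references, explicit CS file includes, and so on, can simply be deleted, because the SDK supplies it.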
So just add the registry package, for example, because that reduces the number of things you take a dependency on by accident. In our case, we just install everything for now, and let's try to rebuild this guy. Now we have zero errors, and if I run my application, it continues to work just fine. But now you might actually add a .NET Core application that targets Linux, and you might use the library, and then you're disappointed when you reach this line here that now blows up with PlatformNotSupportedException. So how would you know about this upfront? Well, you would install this other package that we have, which is our API analyzer package. Let's do that really quick. And now you can see you get these small squigglies in the IDE, and it tells you here that you're calling an API that's not supported on Linux or macOS. And the nice thing is it's also a warning on the command line. If you don't like these things being warnings, you can also make them actual errors by just going to the diagnostic here that says this API is not supported on all platforms, and saying: I want this to be an error. When you rebuild, you now actually get build errors when you use platform-specific APIs. How would you fix that? Generally speaking, my recommendation is that you either refactor your code to literally remove the dependency, if that's possible, which in our case we actually could, or, if you want to give your Windows developers a slightly better experience, you guard the API call, which in our case we do with RuntimeInformation.IsOSPlatform(OSPlatform.Windows). Let me first build... I have to add the using statement. Once I do that, I still get the errors, but now I know it's safe. So what I can do now is invoke the quick fix for the analyzer to suppress this particular occurrence, because I know I guarded the call correctly.
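The guarded call described here could be sketched like this (the registry key path and property names are my own inventions for illustration, not from the demo):

```csharp
using System;
using System.IO;
using System.Runtime.InteropServices;
using Microsoft.Win32;

static class LogLocation
{
    public static string GetLogDirectory()
    {
        if (RuntimeInformation.IsOSPlatform(OSPlatform.Windows))
        {
            // Windows-only code path: because it is guarded, suppressing
            // the analyzer warning here is safe.
            using (var key = Registry.CurrentUser.OpenSubKey(@"Software\Fabrikam\Logging"))
            {
                if (key?.GetValue("LogDirectory") is string configured)
                    return configured;
            }
        }

        // On every other OS (and as a fallback), default to something sane.
        return Path.Combine(Path.GetTempPath(), "fabrikam-logs");
    }
}
```

The important part is the shape: platform-specific calls live inside an `IsOSPlatform` check, and every other platform gets a sensible default instead of an exception.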
And now you basically have code that compiles again, and you know it's safe to be used on Linux, because in our case, if we don't run on Windows, we just default the setting to something sane and write the log file to that location. All right. The next thing you probably want to do, once you have a .NET Standard library, is package it up as a NuGet package. Historically, this has been quite a rocket-science thing to do: you had to read up on how to build NuGet packages, create a .nuspec file, call nuget pack, figure out how to invoke it at the right time in your build, and all of that. You don't have to do that anymore. You can just right-click, Properties, Package, and all you have to do is check "Generate NuGet package on build". When you do that, let's save the project, build it, and then go into the output folder. Now we see we have a Fabrikam logging package here; that's a NuGet package, and it just contains the .NET Standard DLL. So, power of a checkbox: boom, you've got a NuGet package. The next question then is often: great, the compat pack is out there, I get access to Windows-specific APIs, but what if I have to call APIs that are not part of the Windows Compatibility Pack yet? What do I do? Or what happens if I'm, you know, James, and I have to write a wrapper around, say, the iOS APIs or the Android APIs? What would I do in that case? I have a demo for that as well; let me just quickly switch to that demo, demo two. So in my case, I have a very simple... let me first rebuild this to make sure it actually builds. It's always a good idea to start a demo with something that works. Sweet, it builds. So I have an application here, let me just quickly launch it. I should have two applications, actually. I have a .NET Framework WinForms app that basically shows me my GPS coordinates, and as you can tell, I'm clearly a forms designer, because I didn't even bother to name anything here.
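The packaging checkbox just sets an MSBuild property. In the project file, the same thing might look like this (package id and version are illustrative):

```xml
<PropertyGroup>
  <TargetFramework>netstandard2.0</TargetFramework>
  <!-- Produce a .nupkg in the output folder on every build -->
  <GeneratePackageOnBuild>true</GeneratePackageOnBuild>
  <PackageId>Fabrikam.Logging</PackageId>
  <Version>1.0.0</Version>
</PropertyGroup>
```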
But the point is that I have the same thing for UWP as well. So I set it as the startup project, click deploy, and I can launch that as well. And once it's done... yep. Now I have a UWP app that also shows me the GPS position. And what I have here, as you can see by the names, is a .NET Framework library and a UWP library. They basically just encapsulate what you have to do to get the GPS coordinates. On the .NET Framework side, I would use System.Device.Location, and we shipped that API, I think, in 3.5 or 4.0 maybe. So this was before we actually had async APIs, which means these calls all basically block the UI thread. So I put this all into a task and run it on a worker thread, then I have to do some dance around waiting for the device to get ready, and blah, blah, blah. And at some point I get a location. But at the end of the day, all I have to return is a latitude and a longitude. So it would be nice if I could just wrap this as a library and use it from all my applications. Now, on the UWP side of the house, the API surface looks exactly the same: it's get coordinates, latitude, longitude. Ignore all this flickering; I don't know what's going on here, probably I'm on a bad build of Visual Studio. That's the pleasure of dogfooding, I guess. But the implementation looks very different. These APIs are relatively recent, so they shipped async from the get-go. They already have an async way to call them, and you don't have to do any crazy dances here; you just extract the two values and return them. The point is that this signature here looks exactly the same as that signature there. So it would be nice if I could write a .NET Standard library and wrap this. And through the magic of switching branches, because I'm a PM and I cannot actually write code, as it turns out, I'll just switch to my other branch. And now I have a single file. Let me just build this guy first.
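Because the two platform implementations share a signature, they can be folded into one class with conditional compilation. A sketch (the class, helper names, and tuple return shape are my own; the helpers called in each branch stand in for the platform code described above):

```csharp
using System;
using System.Threading.Tasks;

public static class GeoLocator
{
    // One public signature, multiple compilations: the project's
    // TargetFrameworks list decides which body actually gets built.
    public static Task<(double Latitude, double Longitude)> GetCoordinatesAsync()
    {
#if NET461
        // .NET Framework: wrap the blocking System.Device.Location dance
        // in a worker-thread task (helper body elided).
        return Task.Run(() => GetCoordinatesBlocking());
#elif WINDOWS_UWP
        // UWP: Windows.Devices.Geolocation is async from the get-go
        // (helper body elided).
        return GetCoordinatesUwpAsync();
#else
        // Pure .NET Standard build: no implementation available.
        throw new PlatformNotSupportedException();
#endif
    }
}
```

Note the `WINDOWS_UWP` symbol is the conventional one defined for UWP builds; `NET461` is defined automatically when multi-targeting net461.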
Yes, let me run package restore, then rebuild. Yes, let me do the usual dance. When you do some crazy gymnastics with your project file, sometimes you have to restart the IDE. Interesting. Yeah, ignore the error; that might be me doing crazy stuff before. The point is, I still have the two projects for both applications, and I have my class library project here. If you look at that class library project, it has the same API signature, and then we use conditional compilation. So we're saying: if you're building for .NET Framework 4.6.1, then do what we did in the .NET Framework project before; if UWP, then do this other thing; and otherwise, throw PlatformNotSupportedException. And in the editor, you get this very helpful thing over here where you can switch the context. So this is the view I get for .NET Standard; this is the view I would get for 4.6.1. You can see that this is now all grayed out, and this is all in. So you can basically toggle your IntelliSense to the relevant platform, and you get the right data and IntelliSense. Now, how does this magic work? Well, if you actually look at the project file, instead of saying TargetFramework, it says TargetFrameworks, and that enables you to put in multiple frameworks, semicolon-separated. What happens under the covers is that this project is built three times: once for .NET Standard, once for .NET Framework, and once for UWP. And then I can just use conditions and say, for example, if I'm targeting 4.6.1, include this reference here; or if you're building anything else, just include this NuGet package. I could also add a condition on the NuGet package itself if I wanted to. So this way you can effectively encapsulate all the different platform-specific stuff in your library. Now, the other nice thing I can do is, again, under Package, say: give me a NuGet package. So if I build the project now... yeah, which in my case doesn't work, because it doesn't build.
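The project file shape described here might look roughly like this (a sketch; targeting `uap10.0` from an SDK-style project normally also needs the MSBuild.Sdk.Extras package mentioned later, and the reference names are illustrative):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <!-- Plural TargetFrameworks: the project is built once per entry -->
    <TargetFrameworks>netstandard2.0;net461;uap10.0</TargetFrameworks>
  </PropertyGroup>

  <!-- Only the .NET Framework build references the legacy assembly -->
  <ItemGroup Condition="'$(TargetFramework)' == 'net461'">
    <Reference Include="System.Device" />
  </ItemGroup>
</Project>
```

Each framework in the list produces its own binary under `bin\Debug\<tfm>`, and conditional `ItemGroup`s let each build pull in only what it needs.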
Let's try this. Maybe this project builds. What I can show here is that in the bin/Debug folder you now have subfolders, right? You have one folder per TFM, because we produce multiple different binaries. But we also have this NuGet package here. And if you look into this NuGet package, you will see that it has one binary per platform. So from your consumer's standpoint, they don't have to know that you did anything crazy, right? From their point of view, it's just one library, and they just get different implementations based on the platform they're running on. And this is exactly what James is doing in Xamarin.Essentials to give you the experience where you have one API surface that bridges effectively n number of devices. So you can effectively bridge any platform differences yourself by providing these abstractions. All right, we talked about that one. Now let's talk about best practices when you do this. The first thing, which I already said earlier, is that when you have no idea what to do, you should always start with a project that just targets .NET Standard. It's the easiest way, and if that doesn't work for you, that is the point where you actually change your project file, make it TargetFrameworks with a semicolon-separated list, and all of that, and deal with the more crazy stuff. But .NET Standard 2.0 should get you started pretty well. Now, once you want to call platform-specific APIs, that is the time to consider adding framework-specific implementations. But even if you do that, you should never drop support for the standard. Because if you do, your package can no longer be installed into a .NET Standard project. Which means not only have you been forced to provide platform-specific code; all your consumers would have to do this now as well, which kind of defeats the point.
Because the point of your library is to encapsulate this, so your consumers don't have to do this crazy stuff. So never drop support for the standard. The other thing is: what happens in this case when you have APIs that throw? My general recommendation is to also add a capability API that returns whether the API is supported. In my case, for example, let me just quickly show you what I mean by that. In this class here, what I could do is create a boolean property IsSupported that returns true for .NET Framework and UWP and returns false for everything else. This way, your consumers can check upfront whether the code will explode, so that writing the platform check is easier. And generally speaking, you should always use a NuGet package in those cases, because it means your consumers don't have to know which of the five binaries they have to pick at any given point in time; it's just automated for them. And then lastly, if you do multi-targeting, I highly recommend you check out Oren Novotny's amazing MSBuild.Sdk.Extras NuGet package, which basically does most of the heavy lifting you have to do in order to target UWP and Xamarin from a multi-targeted project. It's a really neat thing to check out. All right, now let's talk about the elephant in the room. When I asked yesterday on Twitter what you would like me to talk about, one thing that popped up a lot was: what's up with .NET Framework and .NET Standard? There seem to be some issues. So let me start by apologizing and say I'm deeply, deeply sorry. We had a good idea and we tried really hard; unfortunately, some things don't turn out to work super well. When we shipped .NET Standard 2.0, that was roughly, I want to say, six months after we shipped .NET Framework 4.6.1, I believe. And what we said at the design point of 2.0 was: it would be really neat if .NET Framework 4.6.1 could support .NET Standard 2.0.
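The capability property described here might be sketched like this, using the same conditional-compilation symbols as the implementation (the class name is illustrative):

```csharp
public static class GeoLocator
{
    // Lets consumers check up front instead of catching
    // PlatformNotSupportedException at call time.
    public static bool IsSupported =>
#if NET461 || WINDOWS_UWP
        true;
#else
        false;
#endif
}
```

A consumer can then write `if (GeoLocator.IsSupported) ...` instead of writing its own OS check, which keeps the platform knowledge inside the library.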
The reason being that most of the APIs, if not all of the APIs, that were new in .NET Standard 2.0 were already available in .NET Framework 4.6.1. So the only thing we thought we had to do was change the NuGet mapping to consider 4.6.1 as implementing .NET Standard 2.0. Unfortunately, we found out over the last six to twelve months that there's a very long tail of issues, and I don't have enough slides to tell you what all the problems are. Generally speaking, what you should take away from this is that if you want to consume anything higher than .NET Standard 1.5, you should probably be on .NET Framework 4.7.2. Now, I realize that is not always a choice you have, right? If you're a library author, you don't control what your customers are using. So one thing you can do as a library author is multi-target and add a .NET Framework 4.6.1 output to your library, so you produce a binary that is specific to .NET Framework and your customers don't have that problem. You still compile the same code; you don't have to change your source code at all. You just produce two binaries instead of one. And as an application author, you should upgrade to 4.7.2 if you consume .NET Standard 2.0 binaries, because the experience will be so much better. So again, I'm very sorry. For us, the major learning here is that after a .NET platform has shipped, we will never, ever change the mapping of which .NET Standard version that platform is supposed to implement, because bad things happen if we do. All right, strong naming. Strong naming is a topic that has triggered a bunch of strong opinions, I would say. So before we go into them, I would like to summarize what it actually is and why you should care about it. Strong naming was invented at the same time .NET was originally invented. You can think of it as a key-signing mechanism that gives your library an additional piece of its name.
So basically, in this case, what we have here is... oopsie-daisy, let me actually zoom in to something reasonable. Never mind. Anyway, this is the assembly name that you have, and it has the public key token on the right-hand side. The public key token is the thing that you get from strong-naming your assemblies. You should not confuse strong naming, by the way, with Authenticode signing. Authenticode signing is an industry standard that is applicable to many technologies, not just .NET; it's also applicable to native code. What it does is attach a certificate signature to the code, and it indicates who authored the code, whether it was Microsoft or Google or Facebook or whoever. Strong naming doesn't do that. Strong naming is really just, you can think of it as, a GUID attached to your assembly name, and it was used to locate assemblies in the GAC and allow side-by-side loading and blah, blah, blah. That is what it was originally invented for. Now, there are two major aspects of strong naming that you should be aware of. One of them is that it's viral: if your library is strong-name signed, you can only reference things that are strong-name signed. The second is that removing or adding a strong name is a binary breaking change, because it changes the identity of your library. So if you ship a library, you need to decide upfront whether you ever want it strong-name signed or not. There are some open-source libraries that went the route of publishing two NuGet packages, one signed, one unsigned. I don't recommend that, because it basically forks the ecosystem: people now have to decide which graph they want to depend on, and it's generally not a good experience. We generally recommend that if you build an open-source library that you expect to work on .NET Framework, you should strong-name sign it from day one.
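In an SDK-style project, strong naming is just two MSBuild properties (the key file name here is illustrative):

```xml
<PropertyGroup>
  <!-- Strong-name the output assembly. The .snk file (which contains the
       public AND private key) can be checked into the repository, because
       strong names are identity, not a security feature. -->
  <SignAssembly>true</SignAssembly>
  <AssemblyOriginatorKeyFile>fabrikam.snk</AssemblyOriginatorKeyFile>
</PropertyGroup>
```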
Now, people always say: oh, then I have to maintain keys and all of that. And the recommendation we now have is: just check in the public and private key. Yes, the private key should be checked in. And why is that? Because you shouldn't rely on strong naming for security. We've given this guidance for over ten years now: don't use strong naming to make security decisions. So checking in the private key is fine; it's literally just a GUID attached to the name. That's how you should think about strong naming, and that's basically all there is to say about it. The other topic that always triggers people, you know, sending me almost death threats, I would say, is binding redirects. And that's, I think, the only thing where I would say: if I had a time machine and could go back to .NET V1, that's the one thing I would probably change. So, binding redirects. First of all, this is .NET Framework only. Any other .NET platform that has shipped since then, you know, Silverlight, .NET Core, Mono, UWP, Unity, none of these have problems with binding redirects. So if you complain about binding redirects, rest assured it's a thing you never have to deal with unless you target .NET Framework. And basically what happens is that the .NET Framework assembly loader is very picky: if you build against 1.0.0 and you deploy a 1.0.1 or a 1.1 or a 2.0, it doesn't matter, if the version numbers don't match, it will reject the load by default. Now, you can tell the system that a higher version is fine, but in order to do that, you have to write this XML island here. And of course, that is pretty annoying. So let's talk about how we can make this easier. I have a demo for that as well. I have a very simple console app here that consumes the Fabrikam logging library we showed earlier. Let me rebuild this guy first and restore NuGet packages. And it's very innocent.
It just, you know, uses the logging library to log, and that's it. So if you run this guy, it just works: it logs output, everything's fine. Now we do something fairly innocent. We just say: you know what, I would like to install Json.NET. So let me go to the nuget.org feed, select Json.NET; yep, the stable version is fine; install. And now I Ctrl+F5 the app. And now you get this little thing here: could not load file or assembly Newtonsoft.Json, Version=9.0.0.0. And if you read further on, it says the located assembly's manifest definition does not match the assembly reference. Let me show you what that means. We go to the output folder of the application, and let me launch ILSpy, if I can figure out how to do it. Let me kill everything that's in here and just bring in the ones we care about: our demo, our logging library, and Newtonsoft.Json. When we look at the logging library here, it has a reference to Newtonsoft.Json, but what it references is the 9.0 version. Our demo application also references Newtonsoft.Json, well, it actually doesn't right now, but the point is that we deploy a Newtonsoft.Json, and the version we deploy is the 11.0 version, as you can see here. So 9.0 was referenced and 11.0 was deployed, and .NET says by default: no, no, no, they're not the same thing, so I'm not going to load that. Now, you can manually hack your app.config file to include this binding redirect. I would not recommend that, because, you know, XML is ugly and hard to get right. What you can do instead is go to Properties, and we have this fancy thing here: auto-generate binding redirects. All you have to do is check that box. By the way, that checkbox is set by default if you target anything higher than .NET Framework 4.5.1, in the template that is. So it should be on by default for most people, and if it's not, it's just a checkbox away.
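The redirect that ends up in the deployed config file looks roughly like this (a sketch; treat the exact version numbers and public key token as illustrative):

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="Newtonsoft.Json"
                          publicKeyToken="30ad4fe6b2a6aeed"
                          culture="neutral" />
        <!-- Map every older reference (e.g. the library's 9.0)
             onto the single 11.0 binary that is actually deployed -->
        <bindingRedirect oldVersion="0.0.0.0-11.0.0.0"
                         newVersion="11.0.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```

With the checkbox on, MSBuild writes this for you at build time, which is exactly why you shouldn't hand-author it.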
Now, before I build this guy, let me do one more thing: let me retarget this app to 4.7.2. The reason is, as I told you earlier, the .NET Standard experience is better there. Restore NuGet packages again, build again. Now the output folder is a little bit cleaner as well; well, not really much, but a little bit cleaner, I guess. And now if you run the application, it just works fine. And if you look at the config file we deployed with the application, it actually has the binding redirect for Newtonsoft.Json 11.0. But it's fully automatic. That means if I go back here and change my mind about which version of Json.NET I want to use, if I now say, you know what, I actually want the 10.0 version, and rebuild, then you see that the binding redirect is updated to 10.0. So that's why you should use automatic binding redirects on .NET Framework: you don't have to deal with any of that. And even better, if you target anything but .NET Framework, you never have to deal with binding redirects at all, because the runtime will just accept any version number that is higher than what you referenced by default. All right, versioning. Versioning is a thing that people are always confused about as well. There are effectively four version numbers that you should be aware of as a library author. There's the package version, which is the version of your NuGet package. There's the assembly version, which is the version number of the DLL itself that .NET sees. There's the file version, and there's the informational version. The first two, the package version and the assembly version, are .NET concepts; they only make sense when you build .NET libraries. The other two are also Windows concepts: any Windows PE file has a file version in it, and what usually ends up happening is that when you install binaries, setups will also consider the file version when deciding which file is newer than which other file.
So as a rule of thumb: the package version you have to update with any change, because only one copy of a given version can be on NuGet.org, so you need to increment that anyway. The file version is the thing you should increment every single time you make a change, because otherwise, when people deploy your bits with setups, they may get funky behavior if the version stays the same. On the assembly version, you have a choice. You can basically say, screw it, I only ever have 1.0, we'll never touch this ever again, so I don't have to deal with binding redirects on .NET Framework. I generally recommend against that. I would just say, for your own sanity, tie the assembly version and the file version together, keep them in lockstep, and automatically update them every single time you make a change. The informational version is a string; you can put whatever you want there. Git SHA-1 hashes are usually a good thing to put there, because then you can tell where the file came from just by opening the file in Windows Explorer, right-clicking, Properties. But that is tedious, so let's talk about ways you can automate that. So for this I also have a demo, which will be the thing I say today quite often. So we have our logging library again; nothing magic here. So now let's actually talk about version numbers. So you can go to Properties here and decide what version numbers you want to put in. Let me actually reset this project. Are there any changes in here? I'm not entirely sure whether these settings are outdated, but the point is, if I go in here, you can decide — this is the package version, say 1.2.3 — and you want a 3.0 file version and an assembly version of 2.1.1.1, whatever. You can set them to whatever you want. Now, if you edit your project file, you will find out that nothing happened, and that is probably because I'm running on stale bits. Let me do one thing really quick.
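One way to keep the file version and the assembly version in lockstep, as recommended above, is to derive both from a single property in the project file; this is a minimal sketch, and the value is illustrative.

```xml
<PropertyGroup>
  <Version>1.2.3</Version>
  <!-- Derive both from the package version so they can never drift apart. -->
  <AssemblyVersion>$(Version).0</AssemblyVersion>
  <FileVersion>$(AssemblyVersion)</FileVersion>
</PropertyGroup>
```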
Demo four — is it demo four or demo three? It was demo four. So git reset --hard origin/master — basically a way to say oopsie daisy, let's go back to where we used to be. Perfect. So now we're in the right state for the demo, and that looks much, much better than I expected. Okay, so 3.0, 4.0, hit save, go to the project file, and no surprises there — they're just persisted in the project file. Now, if you want to tie them to each other, it's very simple. All you have to do is use MSBuild. You can just say, you know what, I want this version to be the same as that version. So all you do here is basically stuff like that. No real surprises there. Now, again, how do you increment the version numbers, though? That's the real killer. You don't want to have to remember, every time you ship, to go into this file and do a plus one manually, right? So one thing that I have done, which works extremely well — let me just switch branches again; well, let me first make it clean again and then switch to the branch — is Git versioning. And basically that's a NuGet package. It's called Nerdbank.GitVersioning. It's super neat. It's done by a dev from Visual Studio, Andrew Arnott. And it's super neat because basically what it does is it computes the build number for you. And when you now go to the output folder, you see that we now have — let me just clean them all out and build — so now we produce the 1.0.1. Now let's say I make a change to my library. Let's say I add another method here. You know, let's say we have a Write that takes an object, as opposed to just a WriteLine, right? And then I say git add, git commit, "add support for write object". And now let me rebuild. What now ends up happening is we get a 1.0.2. The reason we get a 1.0.2 is that Git versioning essentially uses the number of commits in your repository to drive that version number.
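The file that drives this is a version.json checked in at the repository root; a human sets the major.minor part (and an optional pre-release tag) there, and the tool computes the rest at build time. A minimal sketch — the exact schema is defined by the Nerdbank.GitVersioning package:

```json
{
  "version": "1.0-beta"
}
```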
And there's also a file you're supposed to check in, which in our case is this document here, which controls exactly what your baseline is. So I can manually decide what my major and my minor version are, because those are usually set by a human being, and whether I'm pre-release or not. And then everything else is just automatically produced by this tool, which means you never have to manually check in version numbers. You just get sane behavior. So I highly recommend doing something like that. Of course, it's MSBuild, so you can do whatever you want. You can have your own scheme, a date scheme, whatever you want to do. Point being: increment your version numbers in a consistent way and you will be better off. All right. Now let's talk a little bit about tips and tricks in general. There are a few websites I would like to show you. First of all, a question that I get pretty much all the time is: is API X or Y in .NET Standard, and in which version of the standard was it introduced? You can use the docs for that, and the docs work extremely well for that, but I've written this website because it's so much faster. I can say Path.Join, for example, pick the one, and I immediately see Path.Join is only available in .NET Core 3.0. If I go to Path.Combine, you see it was introduced in .NET Core 1.0 and .NET Framework 1.0. You can see the assemblies it's in, and so you basically immediately get the history of the particular API, so it's super convenient to quickly see whether an API is in the standard. There's also other data we expose here, like, for example, how popular the API is on NuGet.org, and how popular it is with API Port, the tool that you all keep running to check whether you're compatible. So this is a very, very quick way to check things. The other website that you should know is source.dot.net.
So let's say you want to find the implementation of Path.Join, and by implementation you probably mean the .NET Core implementation. So you immediately can click here, and you can see that this guy just calls JoinInternal, which then does some, you know, amazing things to do it in a fast way. We also have the Reference Source site, which is the .NET Framework source explorer. So if you want to see how Path.Combine works on .NET Framework, you go there and, same thing, you see the implementation, you can click on things, you can navigate the source code, you can find usages. It's a pretty rich experience to quickly navigate around to see how things work, because as an application or library author you very often have to work around bugs, or you have to understand how things work. And in the past what we have all done is open up Reflector or ILSpy and disassemble the framework, but this is a way faster way to navigate the source code of the .NET implementations and figure out why things work the way they do — or don't, for that matter. There's another one that you're probably all aware of, which is NuGet.org, right? No surprises here. But let's say I want to see the APIs. So if I go to System.Collections.Immutable, for example — great, here are the types that are in there, but what members are on the types? And all you have to do here is, instead of saying "nuget", you say "fuget" — FuGet, I don't know how to pronounce that. You go there, and it's a website written by our friend Frank Krueger, who is a very passionate .NET developer, and it's effectively a mirror of NuGet. So what's cool on this website is you can actually browse the contents of the assembly, so you can see, oh, there's the Immutable namespace, there's an immutable set, and here's the API surface of those guys. But even better, you can do diffing. You can say, okay, how does the current 1.5 version differ from an older 1.1 version? Boom, you get the actual diff of which APIs got added, renamed, or removed. So it's a super convenient way.
The other thing is it also shows you which platforms things are supported on. In this case, it targets netstandard 1.0, 1.1 — sorry, 1.3 — and 2.0, and also portable. So you get a quick glance at how these packages work. So super, super handy website, highly recommended. And then last but not least, there's a really handy MSBuild feature I would like to show you, which is called Directory.Build.props and Directory.Build.targets. And let me quickly open up my demo for that. Start page, demo five. Yeah, let me just switch to the finished product in the interest of time. So what I have now here is — let me just quickly go here — how many of you have this problem that when you have a large solution with many projects, you want to lock in the versions of the packages you're using? And so what ends up happening here is you have a props file, Directory.Build.props, which is automatically imported into every project. So you can just say, here's my version number for Microsoft.Windows.Compatibility, here's my Json.NET version, here's my xUnit version. And then in the individual project, instead of hard-coding the version number, you just reference that shared version property. What's even cooler is there's this targets file where you can place packages, for example. So I can say — well, let me actually switch to the XML, might be easier to read — if the assembly name ends with "Tests", then automatically reference xUnit. So what's cool now is if I create a project here — let's say a .NET Standard project; let me actually quickly copy this guy, say Add New Project — and I just name it with .Tests, booyah, it automatically references xUnit, because that's what I said in the targets file. So the nice thing with these files is that they are automatically imported. So basically you have a bunch of MSBuild XML that is automatically available to all your projects.
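What the demo describes can be sketched as two files at the solution root. The file names and the automatic import behavior are the real MSBuild feature; the property names like JsonNetVersion are made up for illustration.

```xml
<!-- Directory.Build.props (automatically imported at the TOP of every
     project file in or below this folder): central version properties. -->
<Project>
  <PropertyGroup>
    <JsonNetVersion>11.0.2</JsonNetVersion>
    <XunitVersion>2.4.0</XunitVersion>
  </PropertyGroup>
</Project>

<!-- An individual project then references the shared property instead of
     a literal version:
       <PackageReference Include="Newtonsoft.Json" Version="$(JsonNetVersion)" />

     Directory.Build.targets (imported at the BOTTOM of every project, so it
     can react to things the project defined, like its name): -->
<Project>
  <ItemGroup Condition="$(MSBuildProjectName.EndsWith('Tests'))">
    <PackageReference Include="xunit" Version="$(XunitVersion)" />
  </ItemGroup>
</Project>
```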
So this is where you can effectively centralize things like what's your copyright, what's your website, is the thing signed or not signed, blah, blah, blah. And if it's not a test project, then automatically produce a NuGet package. So there's a whole bunch of things you can automate away just by using these two magic files. They're also not tied to .NET Standard; they're available for any project, really. It's just an MSBuild feature that works everywhere. So if you do anything with MSBuild today — and you do, because you're building .NET projects — you should take a look at that. Super, super neat. All right, so let me summarize the best practices for building class libraries. There's this implicit one which we don't talk about much anymore, which is the API design guidelines. There's this book you can buy on Amazon, the Framework Design Guidelines, written by a member of my team, that talks about naming and many other things you should be aware of as a .NET developer. But in essence, what you should do is design your APIs well; that's the minimum thing I think everybody wants. The second thing is: do target .NET Standard 2.0. If you're starting with new libraries, just target 2.0 and you're probably good to go for a long time. You should consider multi-targeting, especially when you have customers on .NET Framework 4.6.1, to make their lives better. You should definitely use NuGet to package up libraries that are multi-targeted, so that nobody has to select the right binaries. You should throw PlatformNotSupportedException for APIs you cannot support everywhere, and you should offer capability APIs in those cases where that's necessary. And then lastly, strong name your libraries. Just do it; the world is better off. We have much, much fewer problems. All right, so here are the two URLs you should remember. The .NET Standard FAQ is the one that I keep updating whenever questions about .NET Standard pop up.
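The multi-targeting advice above boils down to one property in an SDK-style project file; a minimal sketch:

```xml
<!-- Multi-target .NET Standard 2.0 plus .NET Framework 4.6.1. Note the
     plural TargetFrameworks. "dotnet pack" builds both flavors and puts
     them into a single NuGet package (lib/netstandard2.0 and lib/net461),
     so NuGet picks the right binary for each consumer automatically. -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFrameworks>netstandard2.0;net461</TargetFrameworks>
  </PropertyGroup>
</Project>
```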
And then I have the .NET Standard docs, which point you to the documentation pages. And I thought we had some time for questions, but apparently I can only see questions if the website is no longer displaying on the teleprompter. So if you have any questions, you can reach me on Twitter; you can also reach me via email. I will stick around for the rest of the day, so feel free. Thank you.