Hello. Oh my, this is going to be fun. All right. I'm Sam, and this is the dep deep dive. By day, I'm an engineer at Stripe, and the rest of the time, apart from being a dad, I'm working on package management for Go. So we're going to dive into dep. First of all, who's heard of dep? See, that makes me very happy. All right, that wasn't everyone, though, so we're going to do a quick overview. I'm not going to do the full history of package management for Go. It is quite a story. But we are at this point now where we have dep, github.com/golang/dep, which is the official experiment. "Official experiment" is a very carefully chosen phrase. There's actually going to be a pile more information coming out very soon about what the official Go toolchain's dependency management work is going to look like, but I don't want to get into that right now. Bottom line: dep is still the official experiment, and it is recommended that you migrate to it. There will be another migration process to the toolchain in the future, but it will be easier coming from dep. So if you've got existing projects, you should migrate. Dep is strongly opinionated about workflow, in a way that is common for Go tooling. So in this teeny little window of time that we have, I'm going to try to get through two basic things: how to use dep day to day, and then at least a little bit about how dep works and why. Most of this is the product of about a month and a half of writing, but with the last release that came out two weeks ago, we put up the docs site, which now has a ton of documentation. It is not yet complete, and the headers are terrible. The formatting is awful.
I'd love to have people come and help me with my terrible CSS, but there is just a ton of information about using dep, and most of what I'm going to go over here is actually covered there, so you can play along at home later. Dep has fundamentally three commands: dep init, dep ensure, and dep status. We're only really going to talk about the first two here. dep status reports the state of things; there's not as much to explore there. dep init is for setting up a new Go project. It can also try to convert existing projects, and it's pretty good at this; we have importers for a bunch of different tools. But I'm going to start the demo here. And this will be fun, because I've got to do this thing with the microphone; see if I can balance it and then talk at it like this. Will this work? Nope. Yeah, that would be great. Okay, thank you. All right. So I'm going to pop into my GOPATH and make a new directory here. I'm hoping that everyone is familiar with GOPATH; again, we don't quite have enough time to explain how all of that works today. But I'm inside my GOPATH, I'm going to go into a demo directory, and I'm going to run a simple dep init. The result of that dep init is that it creates a Gopkg.toml file, a Gopkg.lock file, and a vendor directory. There will be more, but yeah. So the Gopkg.toml file does a couple of things. One, it defines the project's root, and that has two chief effects. One is that everything inside of your GOPATH has its import path determined by its position relative to the root of the GOPATH. Because I just went into a directory called demo, the root import path for the project I created there is demo.
It also has sort of git-like behavior, in that you can run dep commands from anywhere underneath that directory and it will root-find: dep will walk back up until it finds a Gopkg.toml file, and it will use that and say, hey, this is the root of this project, and all of my operations will correspond to this project. Dep doesn't protest if you nest Gopkg.toml files, but it is not at all designed to work with nested projects. That's quite intentional. So Gopkg.toml is mostly hand-edited, and it contains rules. We have some simple examples of that. Wow, I forgot it's Neovim. Hang on. All right. So the basic thing we're doing in a Gopkg.toml file is saying: these are the rules for how we want this project to be treated. I'm not going to explore all of these; again, lots and lots of documentation online. Mostly what we're saying is, one, these are the versions that are acceptable for my dependencies. Two, there's this required directive. The primary use case for required is: I need to pull in the binary for some dependency that I use to run go generate or something like that, like protobuf or msgpack. You can actually point required at main packages, in a way that you cannot legally import main packages. When you do that, dep will pretend it's one of your dependencies and will make sure it has all of its dependencies, so you can version control your supporting tools this way and guarantee that their behavior is consistent. ignored lets you pretend that a certain import path doesn't exist; you can just black-hole it. This is also very useful for some of the weird rough edges that people sometimes run into. But the big thing here is constraints. We're saying, hey, for some project, I want to use only version 1.0.0 or greater, in this case.
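A Gopkg.toml along those lines might look like the following sketch. The specific project names and versions here are made up for illustration (protoc-gen-go stands in for "a go generate tool", and the ignored path is hypothetical):

```toml
# Rules for how this project's dependencies should be treated.

# required pulls in packages that nothing imports -- even main packages,
# which you could not legally import -- e.g. code-generation tools you
# run via go generate.
required = ["github.com/golang/protobuf/protoc-gen-go"]

# ignored black-holes an import path, so dep acts as if it doesn't exist.
ignored = ["github.com/example/some-problematic-pkg"]

# A constraint: the versions that are acceptable for a direct dependency.
[[constraint]]
  name = "github.com/pkg/errors"
  version = "1.0.0"   # 1.0.0 or greater (see the implicit caret below)
```

Note that constraint rules apply per repository root, matching how dep versions dependencies.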
We have what's called an implicit caret. Unless you put an equals at the beginning, like that, we interpret it as saying anything from 1.0.0 up to, but not including, 2.0.0: the semver thing. And then in our last release, we also added automatic pruning. Pruning is a system which figures out which files you actually need for your build, and it will automatically remove anything you don't need from your vendor directory. So that's great. I don't really care, go away. All right. Yeah. So Gopkg.toml contains rules. The other piece is Gopkg.lock; I'm not going to pull it up, because it's empty right now. Gopkg.lock is strictly machine-generated, whereas it is expected that most of the time you'll be updating Gopkg.toml by hand. It is a transitively complete picture of your entire dependency graph, and it contains immutable references, like git commit SHAs. That means your Gopkg.lock file represents a reproducible build for your project. By having this file, you are guaranteed that anyone else, your CI system or any other person you're collaborating with, can get exactly the same versions that you were working with when you generated that file and checked it in. Both the TOML and the lock file should be checked in; that's the expectation. dep init can also try to do conversions, as I mentioned before. Its automatic behavior is: if you run dep init inside an existing project, it will try to read in the metadata from that project and convert it to the equivalent in dep-land. We have support for seven different tools at the moment, and we do reasonably well with these; we've really locked them down pretty well. We don't get many reports anymore. In fact, I don't think I've seen a report of "hey, this obviously converted something wrong from this other tool." There are some things that we cannot honor, though. For example, govendor.
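Concretely, the implicit caret and its escape hatch look like this in Gopkg.toml (versions are illustrative; the [prune] section is how the release being discussed exposes pruning):

```toml
[[constraint]]
  name = "github.com/pkg/errors"
  version = "1.0.0"    # implicit caret: interpreted as >=1.0.0, <2.0.0

  # Prefixing with an equals pins exactly one version instead:
  # version = "=1.0.0"

# Pruning: strip files the build doesn't need from vendor/.
[prune]
  go-tests = true
  unused-packages = true
```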
govendor's model says that you can have different packages from the same repository at different versions. Dep does not allow that. Dep says it is one version for the entire repository, so that we do not create a fractal of combined versions that no one ever anticipated. That can make dep init a little hard to get over if you're converting an existing project; there might be some things which don't quite fit in dep's model. But once you get over that hump, things tend to be a lot easier. Again, the migration process is extensively documented on the docs site. It's not really worth showing here, because it would just be dep init, then it figures out a whole bunch of things, and then you're done. Less explanatory than anything else. You can also try GOPATH mode: if you don't use an existing tool, dep can read off your GOPATH to see what versions of all your dependencies your project has according to your GOPATH, and then use that to populate those Gopkg.toml and Gopkg.lock files. The bottom line, though, is that once you get out of this dep init process, you are free and ready to use dep for real, for normal, which means using dep ensure, because this is all you do, all day, every day, with dep. This is the one command. There are a couple of additional modes for it, for the obvious things, like dep ensure -add. We're going to do this real quick. I'm going to add pkg/errors, because that's one that most people know. See how the Wi-Fi does. Make a dummy main.go file, which just imports that package. All right, and it's done. This gave us an error message, but if we look now at our Gopkg.lock file, we can see that we've got github.com/pkg/errors; it's been pulled in, and we can also see that it is in our vendor directory, the only thing that we actually have there.
So, yeah, dep ensure -add is the convenient shortcut that most people are used to, and I'm going to get to the underlying model in a second, and why it works this way, and why we get this error here. dep ensure -add is going to be a familiar thing for folks who are used to running some "get" command to say, hey, I want to add a dependency to my project. It's something we added, even though it's a little out of sync with the way dep normally works. But you can run it, and it'll pull in an additional dependency. You can also specify multiple dependencies: you can dep ensure -add as many individual packages as you want. The expectation here, though, is that you're specifying the root of a repository. You're not saying I want to add github.com/pkg/errors/something; it's the root that you're adding. When you want to update, then, which is our next command, you can run dep ensure -update, which updates all current dependencies if you run it without any arguments, or just one or two if you name them. So just dep ensure -update, which I'll do, and this will have no effect, because we added this... Right. So I have to explain the model. All right. Most dependency management tools have a separate file, right, like package.json in npm, or a Gemfile, and that is the authoritative list of the dependencies that you actually have in your project. This is Go. We decided not to do things that way; it's more consistent with go get. Dep is built around the idea that what actually dictates whether or not something needs to be in your project is the import statements in your project. There's no work you have to do of, oh, I added a new import over here, now I have to go and update a file to say that that dependency exists. That information is available; it's statically analyzable very quickly, so we pull it out of the files. It also means that when you remove dependencies, there's no bookkeeping to do.
You pull out an import, and if that was the last instance of an import for a given package, you run a dep ensure command and it will just go away. But let's try our dep ensure -update again. There we go. So, no effect on our Gopkg.lock, because pkg/errors was already at the latest version allowed by the constraint we specified. This is a key concept with update. What update actually does is look at the constraint specified in your Gopkg.toml, which here is v0.7.0 with a little caret. Whoa. Oh man, not having screen real estate is hard. So it looks at the versions allowed by your Gopkg.toml and gets the latest one. If, say, Dave were to release a 0.7.2, then running dep ensure -update would go ahead and update to 0.7.2. But because we already have the latest allowed version, dep ensure -update doesn't do anything. In general, it is preferred that you target specific dependencies for update. Running a global update, I mean, you know, maybe do it to sort of test it and try it. But if you're having any difficulty at all, sticking with just one or two will really help isolate variables about what's going on. This was one of the nastier problems that people ended up having with Glide: Glide did not allow you to say, I want to update just this dependency or that dependency. You'd have to update everything instead, and especially for a project like Kubernetes, which tried to move to Glide for a little while, that caused a lot of issues. The underlying thing that's going on, though, is the key idea of dep ensure. Oh right, now I've got like six minutes, so I've got to get to the model right quick. Okay. The reason that the command is called dep ensure... like, this is a weird name for a command, right? This is strange. People look at this like, what are we talking about?
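That implicit-caret behavior can be sketched as a tiny version check. To be clear, this is an illustrative stand-in, not dep's actual implementation: dep uses a real semver library, and pre-1.0 versions have extra subtleties (a caret on a 0.x version is conventionally narrower than "same major") that this sketch ignores.

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// parse splits "major.minor.patch" into three ints. No error handling,
// since this is only a sketch.
func parse(v string) [3]int {
	var out [3]int
	for i, p := range strings.SplitN(v, ".", 3) {
		out[i], _ = strconv.Atoi(p)
	}
	return out
}

// allowedByImplicitCaret reports whether candidate satisfies a Gopkg.toml
// constraint like version = "1.0.0": at least that version, but below the
// next major version (>=1.0.0, <2.0.0).
func allowedByImplicitCaret(constraint, candidate string) bool {
	c, v := parse(constraint), parse(candidate)
	if v[0] != c[0] {
		return false // crossing a major version is never allowed
	}
	for i := 0; i < 3; i++ {
		if v[i] != c[i] {
			return v[i] > c[i] // first differing field decides >=
		}
	}
	return true // identical versions satisfy the constraint
}

func main() {
	fmt.Println(allowedByImplicitCaret("1.0.0", "1.4.2")) // true
	fmt.Println(allowedByImplicitCaret("1.0.0", "2.0.0")) // false
	fmt.Println(allowedByImplicitCaret("0.7.0", "0.7.2")) // true
}
```

This is why, in the demo, a hypothetical 0.7.2 release would be picked up by dep ensure -update under a v0.7.0 constraint, while a 2.0.0 never would be under a 1.x constraint.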
The key behind ensure is not "I want to take just one tiny atomic step; I want to just add a dependency, or just remove a dependency, or update one." Instead, the phrase is something like: hey, dep, please ensure for me that all of my dependencies are satisfied, that all of my constraints are satisfied, that all of the constraints my dependencies place on other dependencies are satisfied, that everything works out and is safe and sane. And then with a flag like -add or -update, it's also: do this little bit of extra work on the side. But instead of giving you a bunch of commands that let you work yourself into an intermediate and kind of incoherent state, dep always tries to return you to a known good state that we think of as being "in sync". And this is why there's no manual bookkeeping. This is all based on the idea of a functional relationship between the different components in the system. We have our actual project sources and the imports in our project. We have rules in our Gopkg.toml. We combine those together and run them through a solving algorithm, which gives us back our Gopkg.lock, the reproducible file that describes the whole build. And then we use that in turn to populate the sources of our dependencies in our vendor directory. We can actually see that at work in a nifty little way here. There's a hidden command called dep hash-inputs, and if we look at the output, it should look familiar. It's telling us that we have some constraints; these are what's declared in our Gopkg.toml. We have one declared on pkg/errors, and then that's the actual constraint; "svc" is an indicator that it's a semantic version type. What's that? That's all right. Thank you, though. Then we also have this import requirements list: these are the actual things that we import in our project.
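That pipeline, imports plus Gopkg.toml rules going into a solver that produces the lock, which in turn populates vendor/, can be caricatured in a few lines. The types and names below are made up for illustration; dep's real solver is vastly more involved, but the point is the functional shape: the lock is purely a function of its inputs.

```go
package main

import "fmt"

// Rules maps an import path to a version constraint (from Gopkg.toml).
type Rules map[string]string

// Lock maps an import path to an immutable revision (Gopkg.lock).
type Lock map[string]string

// solve stands in for dep's solver: the lock is a pure function of the
// project's imports and the declared rules, and nothing else.
func solve(imports []string, rules Rules) Lock {
	lock := Lock{}
	for _, path := range imports {
		constraint, ok := rules[path]
		if !ok {
			constraint = "any" // no rule declared: any version is acceptable
		}
		// A real solver resolves the constraint to a concrete commit SHA;
		// here we just record a placeholder revision.
		lock[path] = "rev-satisfying(" + constraint + ")"
	}
	return lock
}

func main() {
	imports := []string{"github.com/pkg/errors"}
	rules := Rules{"github.com/pkg/errors": "^0.7.0"}
	fmt.Println(solve(imports, rules))
}
```

Because the same inputs always yield the same lock, dep hash-inputs can expose exactly what feeds the solver: the declared constraints and the import requirements list.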
If I go and modify something, so I've added one additional import, then, hey, look, it shows up there. And I can run a dep ensure. deptest is just a test repository for dep, imagine that. And then we'll see that in Gopkg.lock, there's deptest. So I just added a dependency without actually using -add, just to highlight that -add is mostly convenient syntactic sugar. The goal of that design, and there are still some hiccups in there, is that you should be able to just stay in the context of your editor, add import statements like you're used to doing, and then just run dep ensure. It doesn't matter if you've added one or fifty; it will pull all of them in and resolve all of them together. It brings you back to a known good state; it brings everything into sync. Now, here's a fun little trick. Somebody shout out the name of a directory that would normally be ignored by go list, apart from vendor. testdata? Okay, I'll use testdata. All right, so this will be the last thing. I just created the testdata directory. This is normally ignored by go list; you can see that with go list. That's the one I wanted. Yeah, so when I ask go list, it's like, oh, that testdata directory doesn't exist. That's normal behavior. So if I run dep hash-inputs, we see that the import I added in there does not show up either. This is what we want: we don't want dep pulling imports out of our testdata if we should happen to have .go files in there. However, if I do this little thing... please work, please work. Oh, it didn't work. It worked when I tested it earlier. So did I do that right? Did I put a difference back to this one? No, that should... yep. All right. Well, when I tested it earlier, it worked. Hang on. Yeah, okay. Not working right now. I don't know why it's not working right now, but I may have done it wrong.
Wait a minute. Did I import it wrong? Well, the thing that actually does work, I promise, when I'm not screwing something up on stage in a live demo, is that dep is smart about seeing... because it is not technically illegal to import things in testdata, and I know you're thinking, why would you do that? And my question to you would be: why does a protobuf repository do that? Because there is a real protobuf repository that actually imports something in testdata from outside of testdata. But whatever the case, dep is smart enough to follow these import path chains through projects and know when it should actually include those things. It would normally, if I'm not messing it up, pull in that one and see that one. I'm out of time now, right? One minute. All right. Let's pull in client-go from Kubernetes. Let's do that, why not? Let's see if we can do... oh yeah, I want to see if this works or not. Did I spell it right? I spelled it right, right? That's Kubernetes. That looks good. Okay. All right. And there's a subpackage that I wanted to use out of this one. Which one was it? Let's see. It was the traffic... I think that's the one. Let's see. So dep is slow right now; this is a nice demonstration. There are a bunch of pending performance improvements that we're able to put in. But in a different talk, the dotGo talk that I gave, I talk about how dep is really based around functional design principles, this idea of functional relationships between states. And similar to a functional language, it's designed around immutability in a way that allows us to take aggressive advantage of caching. Yeah, "traffic" doesn't exist; I forgot the name of the package. Oh, well. But yeah, it's designed around these functional principles but hasn't yet implemented the caching routines. We're on that next. Thank you. Awesome. You actually have Q&A time while we're setting these up, so I actually have a question.
I know, but I have the mic. So yes. My question is: should you or should you not commit the vendor directory to GitHub? I needed that. That is your choice. We recommend that you commit both your Gopkg.toml and your Gopkg.lock files. Other tools might tell you to do different things with your lock file depending on whether you're a library or an app; we don't make that distinction. You are just a Go project, no matter how many main packages you do or don't have. But your vendor directory is up to you. The only effect of committing your vendor directory is that if you are, say, a pure library and you commit your vendor directory, then you can give your dependers, folks depending on your library, a headache if they are not also using dep. But if they're using dep, then they're fine. Question over there? So he asked why it is that when running dep init, some of the dependencies go into the Gopkg.toml file. Some, but not all: at the moment, constraint directives only function on your direct dependencies, not on your transitive dependencies. So, is this doing a conversion from some existing tool? This is just on a new project? What is it adding constraints for? So you're running it on an existing project of some kind. Okay, okay, yes. So it's basically doing its best-guess job of saying: here is a sane set of constraints for the current dependencies that you have, based on the current state of the world that we could infer. dep init works on a best-guess basis. It's not intended to be precise; it's intended to get you most of the way there. You'll probably have to tweak things a little after running it, depending on the complexity of the project. Other questions? Yes. So we fundamentally can't solve that problem. Yes, sorry. Eric asked how we handle aliases, the fact that there are different routes to a package. We fundamentally can't do anything about that. The only... I'm done? Okay.
We fundamentally can't do anything about it. It's kind of a deep problem in the way that Go is structured. The fact that import comments exist as of Go 1.4, which allow packages to say "I must be imported at this import path," helps a lot. Some future changes that I've been discussing with Russ for a while may also help a lot with that. But basically, they're two separate things, and we don't know that they're the same. It's the same as basically any other tool. Okay. I'll be outside if anybody has follow-up questions, too. Thank you.