So, this is a talk that I wrote because I was frustrated at work, which I'm sure you've all been. Never? You've never been frustrated about tests at work? Then this will all be a wash for you. I gave this talk two weeks ago and I thought this joke was rather clever, so I'd like to give it a shout-out — Sam's been really heavily featured on the stage at this conference. What's with the goat? For those of you who don't know what I'm talking about when I say "obey the testing goat," this blog post is where I found out what the reference actually meant, because I'd heard it several times and I'm too young to have been at PyCon 2010. There's a testing book called Obey the Testing Goat, and I hear it's quite good — it's been recommended to me many times — so I hope its author doesn't feel like I stole the title too hard. Okay. So, I'm an early-career developer; you might be able to tell because I don't look very old. Mostly I do backend stuff, so that's where I'm coming from with the testing perspective. And I've been at my current company in Mexico for three months, which doesn't give you a whole lot of credibility when you come in and try to start changing a lot of things, so this is just what I've been working through for the past couple of months. You may get mileage out of it; you might not. So what's the actual problem? For me: we have no unit tests on a specific part of the code that isn't necessarily customer-facing, but is certainly important and gets a lot of traffic from our internal users every day. That's unbelievably dangerous to me. It makes me anxious all the time; I'm freaking out any time I push code anywhere. I don't want to touch code like that. There are no unit tests. Okay. So it's not just an engineering problem. It's also a people problem. It's both, because it affects both sides of how you have to fix it.
So on one hand, it's a culture problem. You have to change the social behavior and accepted norms of your coworkers, and therefore of your group. And we all know changing culture is really hard. It's not something you can do just by laying down a new law and having everything magically work — unlike code, where sometimes stuff really does magically work. Changing culture requires two things: repetition and reinforcement. You have to keep pushing even when you feel like you're making no progress; you keep trying, keep working at it, keep being a role model, and then maybe you'll get where you want to be. One thing we're really lucky to have on our side here is that most people — most people in this room, at least — want to be good at their jobs; otherwise I'm not really sure why you're spending your weekend at a software conference. So most people want to be good at their jobs, even the ones who aren't here right now, for whatever reason. So instead of implying that people have to write tests just because they have to, you can imply that being good at your job — being a good developer — means writing tests, which is a nice little reframing. As for it being an engineering problem: regressions and breakages are a genuine risk. I have no idea what's not working in my code right now. I genuinely don't know. A couple of weeks ago I found a bug: in the last six months we changed which version of the Amazon SDK we were using, and only in the last couple of months did we find out that, oh, uploading a certain thing didn't work. Well, how were we supposed to know that without tests? This is just not going to go well for any of us. And it's an engineering problem as well because we're missing test infrastructure.
If you have to do a bunch of work — if you have to mock a bunch of things yourself, if you have to do all of this manually because no one else has done it before, or the existing pieces aren't easy to use because they're hidden somewhere or not shared — it's a lot of work, and you're not going to do it, because developers are fundamentally lazy, and that's OK. Testing is also a skill that not everyone really has. I'm sure you've all worked with a new grad who doesn't know how to write a test because it wasn't part of the curriculum at the school they went to. That might be frustrating to you, but it's a reality, and just giving them a framework gets you a long way toward tests that actually test what you want. And debugging gets harder when you don't have tests, or don't have the right kinds of tests. I'm sure we've all run a test, put a breakpoint in somewhere, and used that instead of having to set up ten different things. If you don't have those unit tests, you can't do that: you now have to set up two or three different things and call your code in some mysterious way, and nobody wants to debug if it takes that much work. So, realities of life. You cannot write all the tests for your code base yourself — or at least you shouldn't, unless you're the only developer, in which case you get a pass on this whole culture thing. Tests are great, but they don't visibly add business value the way a social media button on your website does. That adds business value, apparently, so your manager will approve the ticket for that. Strong-arming your coworkers has limited value. You can't waterboard them; it's illegal, and you don't want to work at places where you'd have to do illegal things to get your job done. Please don't do stuff like that to get where you need to be. And finally, tests are useless if you don't run them.
So if you don't have CI set up and you also don't have unit tests, you're probably in a really bad place. But if you have tests and then you never run them — there's no CI — that's something to fix. In contrast, good test suites tell you that the code meets its requirements, so encouraging people to write them helps them clarify what the requirements they need to test actually are. They're free documentation of intended usage: the person who wrote the code is showing you, by writing a test that calls the code, how they think it was originally intended to be used. Good suites have instructions on how to set them up, so that when you get a new developer, or a team member from some other side of the company has to come help you or wants to submit a bug fix, they don't have to read your mind — or read all of the code — before they can figure it out. And they actually test things. What do I mean by that? I know you've all seen a bad test before; in general, they test nothing. This test exercises a terrible piece of code I wrote in the last half an hour that parses an email and splits it into the local part and the domain part. And we're testing it — and the code and the test do the same thing, which is really unhelpful. I've seen tests like this in production, especially with regexes. Don't put the same regex in the code and in the test. That's not helpful. It's upsetting. You're hurting me. Don't do it. You've all seen tests — maybe not exactly like this — that don't really test what you're looking for. Or, God forbid, they don't have an assert. I'm not OK. None of this is OK. So try to actually test things in your code. Good test suites are also documented well enough that you know how to run them. And they don't mock things if you can help it — if you can avoid it at all, don't write mocks.
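That email-splitting test I mentioned — here's roughly what the anti-pattern looks like. This is my own reconstruction, not the actual slide code: a toy `split_email` function, a bad test that re-runs the same regex as the implementation (so it can never disagree with it), and a good one that asserts against hand-written literal values.

```python
import re

# Hypothetical code under test: split an email into local part and domain.
EMAIL_RE = re.compile(r"^(?P<local>[^@]+)@(?P<domain>.+)$")

def split_email(address):
    match = EMAIL_RE.match(address)
    if match is None:
        raise ValueError(f"not an email address: {address!r}")
    return match.group("local"), match.group("domain")

def test_split_email_bad():
    # BAD: the test applies the same regex as the code, so a bug in the
    # pattern passes in both places and the test is tautological.
    address = "goat@example.com"
    match = EMAIL_RE.match(address)
    assert split_email(address) == (match.group("local"), match.group("domain"))

def test_split_email_good():
    # GOOD: expected values are written out by hand, so the test can
    # actually disagree with the implementation.
    assert split_email("goat@example.com") == ("goat", "example.com")
```

The good test would catch, say, a regex change that starts swallowing the `@`; the bad one would happily keep passing.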
It will make your life a lot easier when you realize that the weird bug you're experiencing in prod but not in test is a weird interaction between SQLAlchemy, Factory Boy, and Postgres all coming together to a crashing halt, because Factory Boy wants to increment everything's ID by one, and Postgres also wants to do that, and everybody is very confused at this moment. Good test suites are also organized so you can find things. Getting people to write tests in a way where you can go and look for them later — where a new person can infer where they are from names, file names, and directory structure — is not the worst idea for getting a lot farther on your testing journey. And using planned infrastructure is very, very helpful for that as well. What do I mean by that? Let's say you have an internal API server, and you don't want your unit tests actually hitting it, so you're mocking it out. If everybody uses the same mock, from a standard testing library that you expose to everyone in your company, it's more consistent, it's more likely that someone will actually notice "oh, we need to update the standard mock of our API," and you just have a better shot of being okay when your tests run. So this is a real, real thing: nobody is always willing to write tests. You've definitely seen someone who says this, and it's frustrating. But why are people not doing it? What are we fighting against? In my opinion, people don't write tests for a couple of reasons that sound, to me, a lot like trying to get children to eat their vegetables. There are no consequences. If your kids don't eat their vegetables and you just say, "okay, that's fine, here's cake," they're not going to do it. The same goes for developers: "Oh, you didn't write a test? Well, I'm approving your PR anyway. This is fine." They're never going to do it then.
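To make that "planned infrastructure" idea concrete, here's a sketch of what a shared standard fake might look like. Everything here is made up for illustration — the `make_internal_api` builder, the `get_user` endpoint, the fixture name — but the shape is the point: one fake, in one shared testing library, that every team imports.

```python
# shared_testing/internal_api.py -- a company-wide standard fake.
# All names here are hypothetical; adapt to your real internal API.
from unittest import mock

def make_internal_api():
    """Build the standard fake of the internal API client.

    Kept in one shared testing library, so when the real API changes
    there is exactly one mock to update -- not fifty copies scattered
    across everyone's test files.
    """
    client = mock.Mock(name="internal_api_client")
    client.get_user.return_value = {"id": 1, "name": "Testing Goat"}
    return client

# Teams would typically expose it as a pytest fixture in a shared conftest.py:
#
#     @pytest.fixture
#     def internal_api():
#         return make_internal_api()

def test_greeting_uses_standard_fake():
    api = make_internal_api()
    user = api.get_user(1)
    assert user["name"] == "Testing Goat"
```

The design choice is that tests depend on the builder, not on hand-rolled `Mock()`s, so updating the fake in one place fixes everyone at once.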
So your coworker has now officially been trained to think that you are completely fine with testless code — which you are not, by the way, for the record. No one's watching; there's no social pressure. When your mom leaves the room and you feed the broccoli to the dog — that's the same thing as not having code review. It's not great. Some people don't find testing fun. I'm one of them; I don't enjoy writing tests. They're not my jam. But they're more important than almost anything in terms of basic engineering reliability. If you want to actually have good code, you're going to have to do it, so you kind of have to suck it up — just like being an adult and having to eat broccoli even though you're not really a fan. And some people just don't know how to do it, and they're not going to magically figure it out. Maybe they've never used pytest before, and you have to explain it to them before they're willing to sit down and write five minutes' worth of unit tests. These next reasons are less vegetable-related. If your current test suite doesn't pass — I don't want you to leave the room right now, but you have a current emergency in your code. If the unit tests on your master branch do not actually pass, this whole talk is not really for you anymore. You need to go home, think about your life choices, have a nice serious thought about what you're doing with your life, and then go implement a CI server and make sure that GitHub doesn't allow this to ever happen again. And then there's lack of testing infrastructure support. Again: no standard, tested mocks or tools that you'd need to set up yourself. Maybe you really, really, really want to try Hypothesis — a property-based testing library — for some reason, and that's the only kind of test you're going to be able to write at your company.
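On that CI point: a minimal sketch of what "make sure GitHub doesn't allow this ever again" can look like, as a GitHub Actions workflow that runs the suite on every push and pull request. The Python version, install step, and test command are placeholders to adapt; pair it with branch protection requiring this check to pass before merge.

```yaml
# .github/workflows/tests.yml -- minimal sketch; adjust to your project.
name: tests
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install -r requirements.txt pytest
      - run: pytest
```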
Maybe you don't have a database you can run tests against, because no one has bothered to set one up, so you're writing unit tests that don't really touch the database and you're thinking, why bother? Various things are just hitting you, because it's all yak shaving that you don't really want to have to do. Testing can help you improve correctness, which is great, because that's kind of the point. It gives you free certainty about whether upgrading SQLAlchemy is going to break your code, your friend's code, or the code of that person you've never met who works in some other lab in your company. Everybody will be happier and less anxious when you have to push to master. And it lets you be the first user of your code, which I think is a really important concept. When you first think about how to call it and what you expect the results to be, you become a lot more aware of the actual requirements and the value your code is going to produce — which is kind of a great side effect, because most people think about tests purely from a correctness point of view. And tests prevent feature rot, where an implementation detail changes slightly and it turns out to have downstream effects you didn't realize. If you have a test for it, that can't happen, because you can't check the code in. So this talk is actually about trying to make change. You need to set some well-defined goals, and you want to avoid infinite yak shaving. This is not going to go well if you go to your boss with a proposal that says you would now like to write a test for every line of code the company currently has. Don't do that. They're going to say no, and then you're going to be upset at me for telling you to do it, and I'm going to say: that's not what I said. Try again. So my current scope at work is the following three things. First, the anti-regression concept.
If you write any code at all — a bug fix of some kind — write a test. Somebody complained about this; you noticed it; it was clearly a problem. You now need a test for it so that it never happens again, which means you don't ever have to write an email saying "we know it happened once, it won't happen again," and then a week later it happens again and you have to write the same email and you look like kind of an imbecile — not a great place for anyone to be. Second: new code, new tests. That's kind of obvious, and it's a great starting point for adding test coverage to your code. And third, planning some projects to get standard testing infrastructure developed. I discovered that we run some functional acceptance Selenium tests against a prod mirror. It has to have data in it, but sometimes the tests fail because the data from prod is not quite what they were expecting — and what the hell are we doing? There's no point in that; it's not going to get you any reliability whatsoever. So we're working on adding things like a Dockerized test environment that generates a database for us at test time, which isn't exactly rocket science — it genuinely is not going to get us to the moon — but it will certainly get us away from sending emails saying, "oh well, it failed because one of our users has a special character in their name, and the functional acceptance tests don't understand that yet." You have other options, too. You don't have to do everything I tell you to — actually, do not do everything I tell you to; that's a terrible idea. You can find an old piece of code that nobody is really working on, but that's still in production or in use at your company, and write tests for a very specific part of it. Maybe you're preparing to port it or completely change what it does; having tests before you start making big changes is super useful for not breaking things.
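Back to the anti-regression idea for a second — every bug fix ships with a test that pins the fix down. A sketch of the habit, using a hypothetical `slugify_username` function standing in for whatever broke (say, the "user with a special character in their name" failure):

```python
import unicodedata

def slugify_username(name):
    """Reduce a display name to an ASCII-ish slug.

    Hypothetical fixed version: an earlier version (in this story)
    mishandled accented characters, which broke a downstream test run.
    """
    # Decompose accented characters ("é" -> "e" + combining accent) ...
    normalized = unicodedata.normalize("NFKD", name)
    # ... then keep only the alphanumeric characters; combining marks
    # and spaces are dropped rather than crashing anything downstream.
    return "".join(c for c in normalized if c.isalnum()).lower()

def test_special_character_name_regression():
    # Regression test for the special-character bug: if this ever breaks
    # again, CI tells you before the apology email does.
    assert slugify_username("José Peña") == "josepena"
    assert slugify_username("plain") == "plain"
```

The point isn't this particular function — it's that the bug report itself becomes the test's docstring and its assertion.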
Or maybe your company is about to start a wild project where the kinds of testing skills you have — say, for testing web APIs — don't really line up with what you want to do in the future. Maybe you want to write a compiler, and testing a compiler is a whole different mess and a whole different concept from testing a web application, so you might want to develop your testing skills in that area and learn more about it. Putting scope on both of those things is really important. Don't say "let's get 100% test coverage on files X, Y, and Z" — you probably want to aim for a lower number than that; it's totally up to you. Learning-wise, setting a scope is important too: set a goal, maybe a question the person can answer, just to keep this from dragging on forever in terms of time spent at work. Tickets should be limited in scope for the same reason. Now, before you do all sorts of things at work, you generally need someone to be okay with it. You can't just do whatever you want — that's not usually how most workplaces work, and if you work somewhere like that, I'm not sure I want to work there. It might be in your favor to get co-conspirators. For me — a young, or newer, developer — people at my company have been there longer than me, so if their opinion is that we should do the things I'm suggesting, I have a much better shot at convincing everyone else, because that person lends me some credibility. They also act as reinforcement. You do not — or I hope you don't — review every PR for the code you're responsible for; that's a lot of PRs. There are other people on your team who also do pull request reviews, and having them on your side is super useful for making sure that other people are actually writing tests.
So before you try to convince anyone, you need — or it would be ideal for you to have — a bunch of information to convince them with. If you've had outages, being able to point out how having tests would have prevented an outage is frequently a really good motivator. You've all woken up — or some of you have woken up — at 2 a.m. because of a bug that a test would have caught. See why we should write tests, guys? This is very motivational, because we all like sleeping. If you're planning any big upgrades — maybe you're planning to upgrade to Django 3.0, which was recently released, and you don't have any tests for your web application — you might be terrified of this, and for good reason; that's a terrifying concept. The major upgrade will be a lot less terrifying if you have tests preventing breakages in advance. Also gather some good examples of tests that you like — not the one from a couple of slides ago; that is not a good test — because having good ones to point at, to give to new developers, can be really nice, just to give people a crutch. And a heat map of code coverage. Code coverage as a whole sucks — I think we all agree on that by now; as a single number, it doesn't really tell you anything — but a heat map that says "this code over here, which is our very critical safety code that tells the killer robot not to kill people, should have tests, please" is useful. So now you have information; that's your informational power. Next, you should probably talk to your boss, or your team lead, or whoever sets your priorities. You basically need to ask them for what you want, and framing your goals in alignment with what they want — with what their problems in life are — is really helpful. That's going to get you a lot farther.
So maybe your boss cares about customers experiencing fewer disruptions. Or maybe they're really tired of hiring constantly, because all the people who were frustrated by not having any — or sufficient — tests have quit and gone to work at Google. Or maybe your dev lead is really upset because the project manager and the customer experience folks don't talk to the developers, and the developers don't talk to them, and improving that communication because you've written tests might make your boss a lot happier. Your boss's reaction may vary. Sometimes you're going to have to just do it anyway: you should probably write tests even if your boss does not want you to. But you probably don't want to work there if you have to ask permission and/or forgiveness for that. The culture is officially broken if you can't convince someone that you need tests. It's not good. And no, it's not a visible customer feature, and no, it's not something that will gain you $50,000 of revenue in the next six months. It is, however, going to mean you don't have to hire new people when everyone is burnt out from getting paged at two in the morning because your code sucks. And my code, before I test it, sucks. This is not an indictment of people coding; it's an indictment of people not testing. Your boss might also be super enthusiastic and completely unaware of why this is a problem you haven't raised before: "Oh, you want to do this? It's your pet project. Go nuts." I don't know if I like that reaction either, but it is what it is. So what you're going to want from your boss — especially if, like me, you don't actually manage people and therefore don't have power in any formal way — is permission to do a bunch of things. You want to be able to talk to your coworkers and tell them that this is now a thing that you're doing, and that they are also doing. You want to add some of this to onboarding.
If your onboarding doesn't go over your testing framework, or the things you do to test, you probably aren't doing any onboarding at all. You should probably start doing some — some information sharing with your new developers — and you should talk about testing during it. And you need your boss specifically for performance management. I can't yell at people for not doing stuff at work. I mean, I can say, "hey, I was upset that you did this," but I can't say, "you have six months to get your stuff together or we are going to have to let you go." Performance management talks are your boss's job, and that's why you need them on board with getting these tests written. And finally, approval to add a bunch of stuff to your project load. For us, that's all the unit testing infrastructure we're developing — it's not that it's a lot of work; you just need to have it scheduled in order to do it. Then there's the chat with your coworkers. You need to convince people that this is a thing — which is also why you developed all that information about outages and not being woken up at 2 in the morning. You don't need to go over specifically how to write tests, unless you hire only junior developers and haven't developed any of them past being six weeks out of school, but you do need to set expectations for where they need to be going forward. With my earlier goal of "new code gets new tests," this is where you would mention that, and also where you would mention that you're developing the new infrastructure for having tests at all. You can mention some nice things too. Writing tests is really good for decision-making clarity, and it's free docs — free usage docs. How do you use this code? I don't know, go read the tests, find out how it's called, and then you can use it in your own code.
Which is kind of motivational, because if you aren't writing unit tests, you probably also don't have any documentation. Setting your new expectations can also cover what your code reviews should look like. If you're doing a review and you notice that someone hasn't written any tests — no tests are touched in the code being changed — that's a thing you should mention in your review comments. You can stop people from merging before they've written enough tests, or when the tests they have written aren't actually testing anything productive. In general, go over those things so they know what you're expecting from them as a team now. Now that you've set all these expectations, you need to keep maintaining them. This can't be a 30-day-cleanse sort of trend; it has to be a journey, a permanent part of your reliability focus at work — genuinely a thing you continue working on as part of your work life. Continuing education: if you find a test you like — maybe you read open source code in your spare time, or maybe you just found one in your own code base — sharing it with your team can be really helpful for reminding them that this is a thing you do now, and that you must write tests. A code review checklist — just a subtle "hey, are there tests in this PR?" — can really help people remember that this is the thing they're supposed to be enforcing. And metrics: code coverage sucks, I agree, but having the numbers and being able to say "well, it's getting better now" can get you a lot more buy-in — or a lot more awareness that you have a fire, that no one is actually doing this even though you all said it's a thing you have to do now. And this is how you would find out, potentially, that there are consequences and performance management requirements if someone refuses to write tests. You can't waterboard them, but you don't have to work with them.
Planning future code: you are now a person who has tests at your company, and you should not plan code without thinking about how testing that code is going to happen. If you're going to add additional dependencies like Celery, or if you're going to start having microservices, make sure you're aware that those things have additional testing requirements that may not be what you're familiar with; thinking about that is super important. Having infrastructure for testing is important. Docker isn't the only answer, for sure, but it is a good testing-in-a-box solution, even if you don't necessarily love it for various production reasons. So let's say you decide to go for a distributed systems architecture, and you don't know anything about it but want to write tests for it — maybe don't architect your code that way, or spend some time learning about it first. Thinking about these things in advance is really the key, and when you're writing tickets, giving people the heads-up in advance that writing tests — or writing additional mocks because you're adding a new service — is part of the work is how all of these tests actually get written in the future. As part of your maintenance, there's stuff to cover in an onboarding meeting. In this sort of meeting you're bringing someone up to speed on what your culture looks like. You need to help them make sure they can actually run the tests. You've all had a developer come in and, three weeks in, they haven't run the tests, and I'm not really sure how they're developing — I don't know that any of us can juggle mental models like that in our heads and also be very good at what we're doing. Making sure people can run the tests is really helpful for getting them up to speed a lot faster. Go through the code review checklist, too, just so they have an idea of what's expected of them. Not everyone's code review checklist looks the same as yours.
So giving them that heads-up about what they're expected to do is a lot more orienting than just throwing it at them and saying, "good luck — first code review, here you go." It's definitely helpful to give them examples, and consider pairing with them. If you've never written a test at a given company, maybe there are weird things you don't know about. Maybe all the test infrastructure — all the mocks, all the factories — is kept somewhere you don't know about, and you need to be shown where it is. That's a lot of orienting, helpful information right there. You also need to accept that you can't test everything. You can't control every input to your code. You can probably test most of the way there, but you shouldn't be obsessed with it — or at least be aware that it's a thing you're being obsessed with. Scaling, load, and concurrency are all complicated, especially if you don't have any tests at all right now; they're not something you can immediately, magically start testing. It takes more effort. Prod and dev are never the same. You may discover this when you find a bunch of random code that basically says, "if ENV == prod, do this; if ENV == dev, do this other thing." How is your test going to go if you can't test exactly like production? And there's also just never enough time. You can never write a test for everything; you shouldn't write a test for everything; and tests are not going to be the only way you maintain reliability. So don't forget these things. Don't forget to write regression tests — it's really nice to have tests that catch bugs — and tests are free documentation. So thank you. Thanks, Dwayne.