Hello, everybody. Once again, I've run the most brilliant scam in the world. I'm speaking at a DevOps conference, and I'm not going to talk about computers or DevOps... but I am. The title of this talk is "Piloting Your Project: How to Avoid Anti-Patterns During Planning and Incident Handling." Or, as I like to call it, "How I Learned to Stop Worrying and Love My Stupidity."

Who am I? I'm a DevOps engineer with 15 years of experience. I've worked at Associated Content (AC), the People's Media Company; at Yahoo, who purchased Associated Content (fun times); at Photobucket, one of the largest photo-sharing sites on the Internet; and at Ping Identity. I also have... ooh, that's jumping ahead. That's interesting. I also have a variety of hobbies, like long-distance motorcycle riding, where you ride for 24 hours at a time for 1,000 miles; multi-day river rafting trips; and piloting death traps that were made in the 60s and haven't been updated at all. The consequence of doing any of these badly is that you don't get to do it again, unfortunately.

Which brings us to a point. These are all problems. That one is a problem of keeping two points of contact and not touching down a third. That one is a problem of drowning. That one is a problem of gravity, because whatever goes up is coming down; the question is whether it's coming down when you want it to. When we talk about projects and incidents, we're also talking about problems. Everything is a problem. Everything is a problem. And although we'd like to say, and I want to come in in the morning and say, that everyone here is in the solution business, we go through and wrestle with these problems, and they're the same problems. We do not have solutions. We ask what's wrong with our process when we fail. We think we know the problems. Who's seen this little word cloud? Anti-patterns, organizational dysfunction, information asymmetry. There's a new buzzword every single month.
Someone in this audience is going to invent a new buzzword, and I want you to hit me up on Twitter, @jbreck, and give it to me within the next month. That's not a bad thing, but the buzzwords aren't solving the problems, because we think we also know the solutions. Let's go back. That thing lets you jump ahead.

Still, problems remain. Incidents happen, and we keep on fighting fires. We're up for 24 hours because somebody pushed the wrong UCS config to a blade chassis and just wiped the whole thing out. Fun times. That was 36 hours without sleep. I personally believe that we're answering the wrong questions, which makes us wrong. We just need to be wrong, and we need to be okay with being wrong, because we refuse to accept basic limits on human perception, processing, and performance. We think that a new issue-tracking system or a test harness or a methodology is coming to save us. Everyone here is looking up to the sky and waiting for Superman to come and save them; I'm guilty of this as well, and it's just not going to happen. We're wrong. We need to accept that we're human, and we need to look to the past, to things that have already been done, to understand our limits and do amazing things within them.

We could spend time measuring and experimenting and finding the limits of human cognition as it relates to development and as it relates to operations, and I could probably get funding for that as a startup. Stop jumping ahead. See, it's like a kid that completes your sentences before you're done. But we're in DevOps, and the great thing about DevOps is that a good engineer creates, but a great engineer, like everyone out there, steals. Knowledge is domain-independent, and since it's domain-independent, I'm going to go way outside the DevOps field and steal from the federal government, because they spent billions of dollars figuring this crap out for us, and it cost literally tens of thousands of lives.
And how often do you get to steal from the government and get away with it? I like to steal from the FAA because I fly a lot. Why do I steal from the FAA? Lives are on the line, and they have very few catastrophic failures per operation. Flying on one of those is something like a hundred times safer than driving yourself to this convention. When something happens, you hear about it on the evening news because it's a big deal, but think about how often that happens in relation to the number of passengers flying. They're very safe, and they have some valuable lessons to teach us. The industry faces quite a few of the same challenges that we face, actually, and they've tried many solutions, and they've codified those solutions. It's a giant book. It's huge. Many words.

Can I get someone to come up here for a second? I've got a trick question for you. Look up here on the page: what's that written in? It's in English. That's actually true; it is in English, the de facto language of aviation. But no, to get kind of dramatic: that's all written in human blood. There's not a single thing in the first half of that book that somebody didn't die for, and they were like, holy crap, yeah, you actually shouldn't get blind drunk and fly. 91.384: we'll pull your ticket, if you're alive. Death is the FAA's favorite enforcement method. There are all sorts of penalties in there, but you're usually dead, so it doesn't matter. So they have all of these pages, and you break the rules, and you die. If you're authorized to carry up to 200 people, they die. And in one really unfortunate event, because we just didn't see it coming, thousands of people died. And that was bad.

But they spent years on these regulations. We have all these regulations, all these pages. Everything should be fixed; no one should die in an aviation catastrophe. But still, things like this happen. And hopefully it's not too disturbing for some people, but if you've got audio, can you click Play on that presentation? Is that possible?
There's a little video bar; otherwise I'll do it here. All right, let's see if the lav mic will pick this up. 82 people were on that plane, and 72 of them died, because the pilot landed on the wrong one of two parallel runways, and there just happened to be vehicles on that runway. He made a number of procedural errors and a number of decision errors, which we're going to go over, that could have totally been prevented.

So they're sitting there, and these catastrophes are still happening. People are still dying. And the FAA did some serious soul-searching. They've got this giant thing down here. They've done all the rules, all the regulations, all the methodologies. I mean, things should be ironclad, right? So they did some studies, and they really did some soul-searching. There was a huge working group on it; it probably cost about $25 million a year, I think I read somewhere. And they finally sat there and faced the elephant in the room that nobody wanted to face, and they began enacting changes. And the results here are startling: a dramatic decrease in aviation incidents, year over year. Stop getting ahead of me. I am going to ground this controller, eventually. So, a dramatic decrease in aviation accidents. It's very safe now. And we have to ask ourselves: what changed? Did they write more regulations? Well, yeah, they did, because they love to write regulations. There are people there who really love to write regulations. They all boil down to: everything's your fault. Fascinating fact.

So they faced this fact, this really sobering fact, which seems really simple: people are the problem. I'm the problem. You are the problem. We're all the problem. People cause disasters. People cause us to fail at finding the solutions to problems. In computers, this is more true than anywhere else. People invented everything to do with computing.
We're solely responsible for the outcomes in any situation that involves them, so of course it makes sense that people are the problem. But how are people the problem? Repeat after me: you are stupid. Come on, we can do it. You are stupid. I'm stupid, and that's okay. The point is that humans have fundamental limits on intelligence, memory, and perception. And everyone's thinking, duh. But, get back. So we have these fundamental limits on our ability to solve problems. The memory and perception limits are striking, and they affect all of us. You may think these limits don't apply to you, but they do. You're not special.

And the first way that you're not special is... stop advancing... memory. Notes, please. All right: seven pieces of information in working memory. That's it. This is why phone numbers are seven digits long. Fun fact: if I say 303-555-1212, you may think you can picture that whole string in your brain. What your brain actually does is go "303" and pull that from memory as a single chunk, and then it goes "555-1212". If you try to visualize it, it's really difficult to visualize more than seven numbers. So that's all the RAM we've been given. If you hit your head and go to a neurologist for a workup, they're going to check that you still have your seven pieces of information, and if you do, they'll ship you back into the real world.

We're also pretty horrible at abstraction. We can only handle about three levels of abstraction, three levels of depth, in our working memory while working on something. If it gets beyond that, you're done. Here's a practical example; it's really annoying. If I say A connects to B, and B connects to F and J, and J and D connect to Z and Y, but Y connects to B and Z connects to A and B, I have $20 for anyone who comes up here and draws that. Yeah: you couldn't commit it to short-term memory, so you could never get it back out in one piece. You can't do that.
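That connection tangle is impossible to hold in working memory, but trivial once written down. Here's a sketch in Python of the graph just described (the `reachable` helper is my own illustration, not anything from the talk): questions that are hopeless to answer in your head become one-liners once the structure is outside your skull.

```python
# The connection tangle from the talk, written down instead of held in
# working memory. Even this tiny graph blows past the ~7-item limit.
graph = {
    "A": ["B"],
    "B": ["F", "J"],
    "J": ["Z", "Y"],
    "D": ["Z", "Y"],
    "Y": ["B"],
    "Z": ["A", "B"],
}

def reachable(graph, start):
    """Return every node reachable from `start` via a depth-first walk."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(graph.get(node, []))
    return seen

print(sorted(reachable(graph, "A")))  # everything except the dead-end D
```

Written down, "what can I get to from A?" takes no working memory at all; held in your head, it's a $20 bet nobody wins.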
But we routinely write code exactly like that, by the way, with A and B replaced by method names. I actually looked at a piece of Java code like this. It was supposed to be a small, atomic piece, and I spent about 10 minutes staring at it going, what? Huh? And we write stuff like this all the time. We say, well, the application's more complicated than that. Yeah, but you can break it down to a reasonable level, or you can break it down within your methods and annotate it. There are little tricks you can do. But these incomprehensible, monolithic blocks of code... and we wonder why we introduce horrible errors. We can't actually understand them. We have the delusion that we can understand them. We can't.

We're also really, really horrible at processing information. We can process about four bits of information a second. Four hertz. That's the clock rate of your brain. It's the clock rate of my brain. It's the clock rate of all of our brains. But we frequently design systems that exceed these parameters and expect people not to screw up, and that's the definition of insanity.

In addition to these basic limits, we're extremely prone to errors. Super prone to errors. I don't know if this shows up on the projector, but that image looks like it's moving to most people if you get far enough back from it. It's a static image; it's not an animated GIF. Sometimes it doesn't work on projectors, but I'll show you backstage if you want. Our brain fills in blanks and plays tricks on us. It causes us to remember things that never happened. And that's because our brain isn't a logic-based computer; it's an iterative pattern-matching engine that's massively parallel. We're not Spock. We're not logical. We all like to think we're logical; the brain just doesn't do that, unfortunately. We're also really prone to procedural errors. Oops, I stabbed myself in the hand cutting vegetables. I dropped a brick on my foot. I dropped a server on my foot.
True story: the phone rang while I was carrying the server, and I went, "phone", bam, broke two toes. Procedural error. The brain's not very good at that stuff. Ran through a red light even though I saw it was red. Not a perception error: I saw it was red; it just never clicked that I should stop. We've all done that. Raise your hand if you've never done that. Yeah, that's what I thought. Took off without enough gas. Stop, go back. You're spoiling the fun. This is the most common way pilots die, actually: they take off without enough gas. It's awesome. The number one way to kill yourself. You'd think we'd check this, right? Provisioned a machine with too little disk space.

Once we recognize a problem, even if we perceive it correctly, even if we don't make a procedural error, we're biased towards action as a survival trait. And there's a gender difference on this: men are much more prone to it than women. Men have what we like to call slay-the-dragon syndrome: dragon, problem, act, whether or not it's the right action. Women tend to look at the dragon a little longer and say, okay, that's a dragon, it breathes fire, and think about it a little more. Both tendencies have their problems. But once we recognize a problem, we're biased towards action, and it's not always the right action.

Decision errors. Clearly this person has made some bad decisions. There's also someone in this audience somewhere who has woken up with a tattoo they don't want to talk about after a night of drinking. Yeah, I hear you out there. Errors in decision-making are frequently brought on by hazardous attitudes, and there are four main hazardous attitudes that bring people down. Macho: I'm the shit, I can do this. Resignation: this is the way it's always been, it just sucks around here, and I can't fix it. Anti-authority: screw you, you can't tell me what to do, I'm gonna do it my way.
And invulnerability: this system everyone's worked on is so awesome, it can't ever fail; I don't even need to check it. A lot of these go together. You will frequently find the three musketeers of macho, anti-authority, and invulnerability running together, screwing up your decision-making process all at once. They're super good friends.

Another thing we often see is diffusion of responsibility. Everyone's seen this, and it's kind of a cliché: that's not my problem, someone else will do it; it's all of our responsibility, I'm sure someone else did it. Well, yeah, but if you don't know who's responsible for the problem you're working on, you're already doomed to failure. And not in a blame type of way, not in a "he's responsible for it, it's his fault, put his head on a plate and bring it to me" way. It's more that someone has to step up and take ownership. I said earlier that this book says everything is your fault. It literally says it: everything is your fault. Once you assume pilot-in-command responsibility for a problem... get back... it's your fault. And I take the same responsibility when I solve a problem in the DevOps world. I am the pilot in command of the project I have chosen to take on. I think before I do it, but I'm in charge. The buck needs to stop with me. The decisions on the approach and the resources, that's me. Because when it gets too diffuse, you wind up six months later going, hey, that thing we were supposed to do? I thought Bob did it. Bob didn't do it. Oh, crap. And then we go on.

So, I've told you all the problems. We've all agreed that we are all stupid, and we've come into the 12-step meeting. What do we do to fix these issues? We can't fix people. Everyone tries to fix people. Everyone tries to corral them with little methods. You can't fix people.
You've probably learned this in the business world, or more often in relationships: I'm gonna fix that person. That never works. So the question really is: how do you solve problems? Most people think they have a good problem-solving methodology, and maybe some of you do. But if I go out into the audience and pick 10 people, I'm going to get about 10 different answers. And if you're not on the same page with your decision-making methodology, you're going to have problems, in my opinion. That's one of the things the FAA realized when they were looking at human error. They said people are the problem, they asked those people how they made decisions, and they got wildly different answers. You're never going to get to the bottom of it if everyone makes decisions in wildly different ways and sits there with that primary decision-error, macho attitude of "my way's the best, it works for me." So if we align our decision-making processes, it works a little better.

The next slide contains a decision-making process that I borrowed. Actually, I lied: I stole it, because steal from the best, invent the rest. They like to call this aeronautical decision making. Delete the word "aeronautical"; this is just decision making. It's called the three P's. Perceive: what's happening, what's going on, what's my problem? Process: what can I do about it with the least level of risk? Perform: do it right away. Don't guess; just do it. But do ask yourself before you act: could the end result of what I'm doing be worse than what is happening now? The key to this is small, atomic, easily revertible actions that won't make anything worse. Think small, think stupid, rapidly iterate, and make sure you can reverse whatever is going to happen. Oh, timing, so much. And yeah, if the end result's not worse, ship it. But Jess, Jess, we're not supposed to ship it, we're not supposed to ship until it's done.
Well, maybe you're shipping something too big. Don't always just ship it, but bias towards action, bias towards action. If it's small enough, and you've used the correct decision-making metrics, you can roll it back and go back. And in DevOps you see this in canary-style workflows, where your code goes out to a small slice first; that's kind of "just ship it." Continuous integration and deployment is "just ship it." That's how it works, and it's really almost the embodiment of some of these practices.

So I talk a lot about biasing towards action, not inaction, which raises the question: why do we bias towards action? We bias towards action because the perceive-process-perform model is the basis of all human evolution. Fortunately for us, when we go through that loop wrong, it doesn't usually end in death, so we get a bit of a break there. I don't think many organizations could justify rapid iteration with disposable human capital. But it's a delusion to sit there and think that we can quickly craft a better system than that loop. We're stupid, we're simple. Get back. But if we work within these constraints, we can achieve great things.

And let's reframe this from a DevOps perspective. Barney's gonna help us: sprinkle some DevOps on it. That fixes everything, right? Just toss a little DevOps on it. I like to go through this as look, think, make, ship, learn, plan. But this isn't different from the three P's; it is the three P's. Perceive: what's going on, is it a problem? That's look and think. Process: think. Perform: make, and ship is also perform. Learn: look at what's going on again. Plan: process again. And the next perform is kind of meta, because you go back in there and do it again. In this model you look at it, you think about it, you make it, you ship it. We like to start with the planning first. But remember: we're stupid.
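The perceive-process-perform loop with a revert path can be sketched in a few lines. This is a hypothetical illustration only: `check_health`, `apply_change`, and `revert_change` are stand-ins for whatever your own tooling provides (a canary health check, a deploy script, a rollback), not real APIs from the talk.

```python
def three_p_loop(change, apply_change, revert_change, check_health):
    """Perceive -> Process -> Perform, with a cheap way back.

    Ship only small, atomic, revertible changes; if the result is
    worse than where we started, roll back immediately.
    """
    baseline = check_health()        # Perceive: what is going on right now?
    if not change.is_revertible:     # Process: could this make things worse,
        raise ValueError(            # and can we undo it if we're wrong?
            "refusing to ship a change we cannot revert")
    apply_change(change)             # Perform: bias towards action.
    if check_health() < baseline:    # Did we make anything worse?
        revert_change(change)        # Small + revertible means this is cheap.
        return "reverted"
    return "shipped"
```

A canary deployment is just this loop with `check_health` pointed at the canary slice: ship small, watch, and revert the moment the baseline degrades.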
No plan survives first contact with the enemy. Go out there and try it first. See what doesn't work when you throw it against the wall. Then learn, plan, and ship it again. Lather, rinse, repeat. This is just an expansion of the three-P model.

We're never going to remove human error. It's futile; you just can't do it. We can, however, mitigate it, using the three P's and by avoiding our hazardous attitudes. We really need to gut-check ourselves, keep calm, and keep it simple. Then ask: can I instantly understand this under stress? I often say, if I can't open a file, whether it's configuration management or some system I'm working with, and understand what it's doing, the TL;DR version, in about 10 seconds, I've failed. If someone else can't understand it, I've also failed, especially if they have to spend half an hour looking at it going, I don't know what's going on here. Get back. Does the solution to my problem violate any human limits? More than seven pieces of information? More than three levels of depth? Does it require us to process more than four bits of information a second? That last one isn't actually as important in our field; it matters more in real-time situations, where if you're trying to process more than four bits a second, you fly into a mountain. But keep it in mind; somebody here might be dealing with real-time inputs.

And everything happens for a reason. Sometimes the reason is that you're stupid and make bad decisions, which is why we need to evaluate our decisions and check them for hazardous attitudes. Every time I've screwed up on a project, or screwed up and almost died, I can pretty much pinpoint it: if it's not a perceptual error and it's not a processing error, it's macho, invulnerability, resignation, or anti-authority. I thought I was better than everyone else.
I thought I could get away with it. Or: this always sucks, so I'm just not gonna fix it, and then three weeks later it breaks. These are the things. So when we're making these decisions, going around our little three-P loop, when we get to that last step, ask: am I being macho? Am I being invulnerable? Am I resigning myself to my fate when I don't need to? Am I being anti-authority? Anti-authority is a big one, especially in the culture we have. A lot of us my age kind of grew up in hacker culture. I know that, well, it was quite a while ago, I stole credentials to a VMS system to get access to a Unix-like environment and get on what was the Internet before our good friend Al Gore invented it. So we're used to being anti-authority. People tell us something can't be done, and we do it anyway, because that's who we are. That's what we do. So we're especially prone to that error. If these attitudes are present in your decision, stop, loop back through your cycle, and start over again. See what you can do to remove these really, really hazardous attitudes. Then we can go on.

There's not a silver bullet for reducing human error. There's not a solution to the problem. I'm not going to go backstage and sell you my book, or take you to my boot camp and teach you how to eliminate all human error by replacing everyone with sentient machines. But if we perceive, if we process, if we perform, and if we simplify, we can keep things within the limits of human memory, processing, and perception. If we acknowledge that procedural, perceptual, and decision-making errors happen, we can mitigate them, if we're honest with ourselves. We can get to a good place. Not a perfect place, not a silver bullet. But let's not let the perfect get in the way of the good, because that's one of our greatest enemies. Thank you very much. Two minutes? Two questions. All right, I've got time for two questions, they say, if anyone has any. Huh?
@jbreck, J-B-R-E-C-K, on Twitter. Sorry, I'm not a big Twitterista, but I check it after conferences. When I'm asked? Yes. The big focus has been... they still like to write regulations, but when they train the air transport pilots, and even in the FAA Wings program for continuing-education credit, the big focus is on eliminating human error. They've even started to be honest that no one can memorize all those regulations; you have to make generally good decisions, because no one sits there and goes, oh, holy shit, I'm about to die because I didn't do 91 part 308 and 92 part 407. You die because you forgot to put enough gas in the plane. You die because there was a chain of errors: a mechanic forgot to screw something in on the engine, and you could have caught that during pre-flight, but you were too lazy, because you're invulnerable, because the plane always works, right? It's worked for the past 200 hours, and I don't have enough time, and I really need to get to that wedding I'm supposed to go to for my sister. And all of a sudden the engine quits over the mountains, and that pretty much always ends badly. It's that chain of errors. At any point you could have stopped it by cutting loose any of those attitudes and breaking the loop, but you didn't. That's what they're really trying to pound into people's heads. But yes, that is what they attribute it to, and we haven't found any evidence in something like 25 years that they're wrong. So maybe they'll come up with something better someday, but that's pretty much it. Only about 7% of fatal aviation accidents are the result of mechanical failures that could not have been foreseen, where the plane just drops out of the sky. The rest is people doing dumb stuff and killing themselves or a bunch of other people. Like, incandescently dumb things. But that's a whole 'nother discussion.
Catch me afterward in an open space and I can tell you all sorts of horror stories that will make you never want to fly again. Oh, that's it. Thank you guys very much; I think it's time to get the next talk going. All right, good job, Jess.