My mom's in the audience back there. That's great. We need a little button here. There we go. All right, now we're in business. All right, well, thanks for sticking it out this long. I know you're really just waiting to get the iPad drawing, but that's okay. I'm here to share with you a few secrets of the universe. Are you ready for that? Ready for some cool secrets? It'll be fun. This is the 10th year since the Agile Manifesto was written. It was 10 years ago that some 17 of us who were interested in such things got together in Snowbird, Utah and said, well, you know, this is how we think software development should be. And if you've ever been to the Agile Manifesto site, there's this wonderful picture in the background. That is the back of my head. Right there. Proof that I was there. So since this is its anniversary year, I've been talking more about Agile philosophy than I normally would, and I'm going to stop after this. I'm kind of done with that. But I wanted to just go over a couple of things that I thought were interesting, looking back at my notes from the original session and what's happened in the last decade or so. And one thing that I really, really want to hammer home, and I try to do this in almost every talk, is the importance of context. The importance of context. Why things in context make all the difference. Now what do I even mean by context? Well, if I say, draw me a picture of a tree, or think of a tree, you probably think of something like this. It's an object. It's a tree. It's sitting right on the ground. If you're in sixth grade, that's probably how you draw it. I still draw like that to this day. But that's a tree, right? No, that's not a tree. In systems thinking, you learn to look at things not as this one object but as a set of interrelating systems. In this case, a tree is really several interrelating systems. You've got all this respiration business going on with the leaves and the air and photosynthesis and whatnot.
And now here you've got all the roots with the Krebs cycle and nitrogen and all that stuff from biology that I didn't remember the first time. All these things going on. Not a little pristine object sitting there on the ground. It's all these messy interacting systems. Guess what? That's how the world really works. Your software, your team, your organization. It's like this. It's messy. It's all connected. When you look at something, you have to look at it in context. You can't just take it out and set it apart like that tree, because then you lose the essence of what it is you're trying to look at. Now, on a more humorous note, context makes a big difference. For instance, it's normally okay if you're just holding an axe, unless you're hitchhiking. Then that's probably not such a great idea. Did you ever hear of this guy, Joshua Bell? I love this story. This fellow is a professional violinist, a famous violinist. He plays on a three and a half million-dollar Stradivarius. That's his axe. That's what he plays on. He normally plays places like Carnegie Hall, these sorts of big venues, $100 a seat just to get in to see him play. As part of an experiment back around 2007, he played incognito in the Washington subway. Pulled up a folding chair, put his hat out, and started playing on his three and a half million-dollar Stradivarius in the subway. All told, thousands, tens of thousands of people going by. Guess how much he made in the hat? Very close, 32 bucks. Seven people chipped in a total of 32 bucks to hear this master play a three and a half million-dollar violin, because of the context. If he'd been in Carnegie Hall, it would have been different. But he's in the subway. They're thinking, well, how do I even judge it? Is this good music? Is he a good player? Is he just a crank? We usually see cranks in the subway. Maybe I'll just ignore it. And this makes a really big difference. Great things aren't just great by themselves.
They can only be great in the proper context. So, for instance, if you want to introduce some new practice into your team, you want to get them to start doing whatever, pair programming or TDD or something, well, you have to be kind of careful how you present it. Otherwise, you sound like the crank in the subway. The idea by itself isn't good enough. It's the context. It's the environment, the picture that you paint. So, getting back to some 10 years ago, I found this while poking around for some 10-year stuff. This was a picture, actually, of the television monitor at the hotel, February 2001, six in the morning. It was then called the Lightweight Methods Conference, because we hadn't yet invented the term agile. That happened in the afternoon. So, we got together, you know, a bunch of folks who were interested in these things. None of the stuff was really new, particularly. If you go back, you know, pair programming goes back to the 50s. Easily. This is not a very new kind of an idea, and most of the ideas weren't particularly new of themselves. But here's the interesting thing: I actually found this little notebook that I had taken to Snowbird and made notes in during the meeting, all the stuff that we talked about. And as I looked it over this year, it struck me: we didn't talk about practices. We didn't talk about pair programming. We didn't talk about TDD. We didn't talk about refactoring. We didn't talk about version control. We didn't talk about any of that stuff. Now, that's kind of curious, because at the time, the adoption of these things was a big problem. I used to go to events like this, and I'd say, all right, tell me honestly, raise your hand if the project you're working on at work does not use version control. And about a third to maybe 40% of the audience would raise their hands sheepishly. They'd raise their hands because they didn't even have any kind of version control in place. They'd have a big shared disk.
Everyone would mount it. You'd all write code and last one in wins. So, you know, here we are 10 years later. Fortunately, that's less of a concern. Now it's like, you know, raise your hand embarrassedly if you don't use Git. There's a few more Subversion guys in the back. Corporate standards, sorry. And the guy who uses CVS is out in the hall and he won't even come in. But that's a lot better than it used to be. Back then, that wasn't the case. And yet, we didn't talk about practices. So what did we talk about? What did we mean when we came up with the term and came up with this idea of agile? Well, here's what we meant by it. We thought agile should be something that's ever shifting, ever changing, and ever responding. That doesn't say anything about pair programming or any of the other particular practices. It's talking about adapting to change. Kent Beck's book on XP is subtitled Embrace Change. I don't think anyone ever really got the subtitle. It's like, no, change is horrible. It's icky, we don't like it. Tell us what we have to do. And we go, hey, we've got unit testing, we'll do that. We've got TDD, great, we'll do that. Well, there's more to it than that. So in the course of, I guess it was the Wetware book, the Pragmatic Thinking and Learning book, I came across this great definition from a professor who was talking about the nature of expertise and the nature of learning and how we gain skills. And she came up with this thought and said, whatever practices you do, they can never be completely objectified or formalized, because they must always ever be worked out anew in particular relationships and in real time. In other words, in context. Practices can never be completely objectified. You can't go in with the big book of practices and say, all right, we just do this and it will all turn out okay. It doesn't work that way. It has never worked out that way. Despite what folks at, you know, the CMM might like to think, there was that whole SWEBOK effort.
Do you remember that? The Software Engineering Body of Knowledge. It sounds like something out of a bad Blue's Clues episode or something, right? But this was a big deal. Like IEEE was in on it. I think ACM, Carnegie Mellon, you know, maybe IBM, somebody like that, you know, big folks wanted to write down exactly what that quote says you can't. They wanted to write down all the practices, do this, and, you know, it'll all work out. It doesn't happen that way. So that's one thing I think we've kind of forgotten in the last decade. The other thing is the fact that, at the time, and I found this in my notes, Agile comes from real science, right? There's Beaker. There's real science. It comes from things like chaos theory, from the Japanese idea of kaizen, from systems thinking, like we talked about with the tree, from better risk management, from looking for return on investment. This is all real stuff, and it actually works, and it's actually important, and I think we've lost sight of a lot of that. The first, chaos theory, is to me just an amazing thing. This is the closest bit of science that comes to black magic outside of dark matter, which I still think they just made up. But emergence is very cool. Have you ever seen the Boids simulation? This is absolutely marvelous. You look at the behavior of the flock here, and it's doing all these wonderful, interesting, complicated kinds of maneuvers, none of which are programmed. The only rules that guide the motion of these particles are right here. Three rules. Separation: steer to avoid crowding. Alignment: steer towards the average heading. Cohesion: steer to move towards the average position. That's it. Real birds do this, more or less, and you get all kinds of nifty flocking behavior. You get avoidance of obstacles. You get it where they will split around an obstacle and re-form. You get all this great emergent behavior that's not programmed in. It just emerges from the system.
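Those three rules are small enough to sketch in code. Here's a minimal sketch of one simulation step; the function name, neighbor radius, and steering weight are illustrative assumptions, not from any particular boids implementation:

```python
import math

def step(boids, radius=5.0, weight=0.05):
    """Advance each boid one tick. A boid is a dict with 'pos' and 'vel',
    each an (x, y) tuple. Only the three rules drive the motion."""
    new = []
    for b in boids:
        # Neighbors: every other boid within the given radius.
        nbrs = [o for o in boids if o is not b
                and math.dist(b["pos"], o["pos"]) < radius]
        vx, vy = b["vel"]
        if nbrs:
            n = len(nbrs)
            cx = sum(o["pos"][0] for o in nbrs) / n  # neighbors' center
            cy = sum(o["pos"][1] for o in nbrs) / n
            ax = sum(o["vel"][0] for o in nbrs) / n  # average heading
            ay = sum(o["vel"][1] for o in nbrs) / n
            sx = sum(b["pos"][0] - o["pos"][0] for o in nbrs)  # separation
            sy = sum(b["pos"][1] - o["pos"][1] for o in nbrs)
            # Cohesion + alignment + separation, each nudging the velocity.
            vx += weight * ((cx - b["pos"][0]) + (ax - vx) + sx)
            vy += weight * ((cy - b["pos"][1]) + (ay - vy) + sy)
        new.append({"pos": (b["pos"][0] + vx, b["pos"][1] + vy),
                    "vel": (vx, vy)})
    return new
```

Run it in a loop over a few dozen boids and the flocking, splitting, and re-forming all emerge; none of it appears anywhere in the code above.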
Emergence is where you get complex behavior from simple interactions. This was something we talked about. We talked about this a lot. This is why you want to have sort of one team, why you want to be co-located, why you want to keep things really easy and really simple. Because from those simple interactions, complexity emerges. That's a much better way of trying to do it than trying to do, like, you know, the SWEBOK and get everything all written down in a nice little book up front, which is obsolete as soon as you finish typing it. Kaizen. This Japanese idea of continuous improvement. This was a really great idea. We all screw up. I screw up a dozen times before breakfast, usually. Right? That's what we do. We're human. That's fine. But from each one, you have to learn and do it differently the next time. That's the big problem. We all make mistakes. I do stuff wrong. The team does stuff wrong. The corporation does stuff wrong. And they're like, oh, dang. You know, that was real bad. And then they do it again. The exact same way. And by the 15th time, it's like, you know, we really ought to do something about that. And, you know, they're like that knight in the Monty Python thing, right? There you go, come back and fight! He's lost all his limbs by then. Continuous improvement. You know, always looking to see what you can do better, how you can fix it, how you can learn from failure. Which gets us to this idea, key to Agile, of having continuous changes to your code, your process, based on continuous feedback. Not one, you know, single post-mortem meeting at the end of the project. What do you do with that knowledge? It's too late. Right? A big project retrospective. Yeah, great. The project's done. That's not going to help. Do a retrospective each iteration. Each day. When you need to. When you can get continuous feedback and do something about it. Looking at risk. We really liked this idea, and we talked a lot about avoiding actual risk.
So, where is the actual risk in programming? It comes from writing code. Right? So, write less code. If you don't write code, it's got no bugs in it. This is an easy idea. I really like not writing code. This is my goal in life. I want to get as much done as possible while writing as little code as I can. Because everything you put in there is just, it's like a Petri dish. It's a breeding ground for bugs. So, it's like, you've got this giant thing that, you know, maybe you're going to use it, maybe you're not, but you put it in there anyhow just in case you need it. Right? And it's just sitting there breeding. It's like one of those Lysol commercials with these things just growing under the sink. So, write less code. Back in the original Pragmatic Programmer book that we wrote back at the turn of the century — doesn't that sound horrible? That just sounds awful — we really advocated a lot of these ideas of reversibility and defensive programming, and that still holds today. Just because you have unit tests, big deal. That's not going to save you. If all your unit tests are testing the one happy path that you wrote through the code, that doesn't count when you hit the real world, where disks fry and networks blow up and users do stupid things, and programmers do stupid things, and companies do stupid things. So, that's not enough to save you. You've got to go back to this idea of defensive programming, of reversibility. You know, you make some big decision in your code. That's great. What if it's wrong? What's your plan B? How are you going to reverse it? How are you going to back it out? Do you have a facade in front of it? Is it insulated somehow? What's your plan? What are you going to do to get around it when the whole thing goes tits up, as they say in England? These things happen. Now, bad as all that is, here's the real killer. We were so excited about this Agile idea. It's like, hey, this is great.
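That "facade in front of it" idea can be made concrete with a tiny sketch. Everything here is hypothetical (the store names are mine, not from the talk): the point is only that the rest of the code talks to a small interface, so today's decision can be backed out by changing one factory function.

```python
class MessageStore:
    """The facade: the rest of the codebase talks only to this interface."""
    def save(self, key, text):
        raise NotImplementedError
    def load(self, key):
        raise NotImplementedError

class InMemoryStore(MessageStore):
    """Today's big decision: keep everything in memory. If that turns out
    to be wrong, swap in another MessageStore; callers never change."""
    def __init__(self):
        self._data = {}
    def save(self, key, text):
        self._data[key] = text
    def load(self, key):
        return self._data[key]

def make_store():
    # The single place where the decision lives. Plan B is this one line:
    # return it as a database-backed store, a file-backed store, whatever.
    return InMemoryStore()
```

The decision is insulated: reversing it is a one-line change behind the facade instead of a hunt through every caller.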
You know, we do this continuous feedback thing and we'll look at risk and we'll do all this and this is wonderful. But there's one thing we didn't really clue in on at the time. And that's the fact that as human beings we're really not wired for this kind of behavior. You know, why Johnny can't be Agile. There are cognitive reasons that we don't do Agile particularly well. And one of them is, you know, we're not Spock. That's the first thing. But we have this notion that we sort of should be. And we're so used to programming the computer, working on the computer, we tend to get this idea that humans are logical. That we're rational. And of course, it's much more like this. That's a much more accurate picture of human thought processes in action. If you look on Wikipedia — don't do it now — if you look on Wikipedia under cognitive biases, there's something like 90 or 100 common cognitive biases that screw up how you think, how you make judgments, how you perceive things. 90 common ones. I've met some folks who have way more than just that. Those are the common ones. And unfortunately, there's a handful of these that really affect Agile adoption and trying to use these sorts of methods. The first bad one is the need for closure. There's this great Dilbert cartoon where Dilbert says, I didn't have any accurate numbers, so I just made this one up. Studies have shown that accurate numbers aren't any more useful than the ones you make up. How many studies showed that? 87. The funny thing is, many people, most people really do prefer that. They would rather you make up a completely fake number just so they can have closure than leave it open and undecided. Some people just cannot stand leaving something undecided. So, you know, the company, the client says, well, when will you be done with this project? Well, we don't know. Honestly, we don't know. We've never done this kind of thing before.
We've got 60 people on the team. It's a new technology. It's a new version of Rails. You know, we don't know. Yes, but we need a number. Okay, 87. Fine. There's this driving need for closure which makes things like big design up front very appealing. Look, we've got a book. It's in a binder. It's got this design in it. It's total crap, but it's in a book. It's right here. Isn't that great? We've got closure. And this, you know, fundamental psychological need drives a lot of that. Just tell me. Even if it's wrong, just tell me. I gotta have something to hang on to here. Well, of course, the reality is uncertainty leaves your choices open. And we want to stay as uncertain as we can. Postpone all the major decisions as long as you can. There's a graph I show in one of these things where you think about your knowledge over time on the project, right? When do you know the least about the project? Day one. When do you know the most about it? At the end. When do you want to make all your big decisions? At the beginning? Or closer to the end? You know, simple stuff. But we forget about this in the heat of the moment, often. Another thing that we do just as people: when somebody does something bad on the project, it's because of who they are. It's because they're Ukrainian. Because they're short. Because they're tall. Because they're wide. Because they're whatever. You want to pin it on them. And this is called the fundamental attribution error. Thinking that people are the way they are because they're just born that way. Well, in fact, what it really is, everyone's reaction is because of context. Yet again, what's the context? Maybe they had a bad morning. Maybe they had a bad upbringing. Whatever. These things happen. Now if it's you and you make a bad mistake, oh, it's easy to excuse. I had a bad morning. My mother didn't love me. I was abandoned on Tatooine. Whatever. These things happen. But when it's somebody else, well, they're just born evil.
That QA person who screwed us over and gave us all these bugs back, they're born evil. You know, spawn of Satan right there. Not true, not helpful. You've got to look at the context and see, you know, somebody who is a client is being a real creep. Well, why? They likely weren't born that way and they're probably not having a fun time with it. You know, get to the root of it and find out what happened. Another good one that causes people and groups to hang on to technology when they shouldn't is post-purchase rationalization. And we've seen this, you know, people go out and spend a bucket of money on some fool of a tool because some consultant told them they needed it, and by God, you peons are going to use it now because we just spent a million bucks on this stupid thing. Never mind no one can configure it, or if it's like Lotus Notes, you need 14 administrators for every user. However that works, we've paid the money. We're going to stick with it, you know, till the bitter end, which is right around the corner. But this is, I mean, this is funny and true on a macro scale, but this is true personally too. Say you had invested a lot of time in something deplorable like Java, maybe. You know, and there are people out there, you know, this was their career. They introduce themselves, I'm a Java programmer. They've got all the certifications and all this and all that, and I'm sorry, son, but, you know, how long do you hang on to something you really shouldn't anymore? Is Rails still the right thing? Is anything still the right thing? Hello. Yeah, yeah, yeah. Speak any heresy and the mic just starts cutting out. I know this thing is rigged. But the point is, just because you have a huge investment in something, whether it's money or time — if it's time to let go, let go, move on. Don't stick to something just because you sunk a lot of time or energy into it, even if it's code, right? What's the best thing for code quality? A magnet on the hard disk.
That is the best thing to improve your code quality. Throw it out, start from scratch. That is always, I mean, stunningly, that is always the best idea. Just kill it and start it over. We don't like to do that. We hang on to old code long past its expiration date, right? What do people do? Well, this doesn't work. Let me comment it out. We take little bits here and do this, and it's like, no, delete it, frag it, waste it, get rid of it, start over. And a similar kind of thing: if you're used to seeing a lot of something, you see something over and over again, it's easy to confuse familiar with good. So, some things you might see an awful lot of that have no redeeming value whatsoever. But somehow it seems kind of acceptable because, well, I've seen a lot of it. You know, she must be famous for something. She's famous for appearing in a sex tape with some R&B singer. Kim Kardashian. I had Paris Hilton, but I thought that was a dated image, so I figured I'd update it. Because every time I turn on CNN, they talk about Kim Kardashian and her wedding. Who cares? It's pointless. It's pointless. There's no redeeming quality. And yet, this becomes easier to accept because, oh, it's all over the place. Therefore, it must be acceptable. It must be okay. Right? Java must be okay because it's everywhere. Flu germs are everywhere. That doesn't really make them, you know, particularly any good. Loss aversion. This is a similar kind of thing. Do you know about the South American monkey trap? Yeah, a couple of folks. This is a great thing. If you want to catch a monkey, you dig a hole in the ground, sort of like this, with a larger space at the bottom. You throw some bananas in. Monkey goes down, grabs the bananas. He's stuck. He can't pull his arm out. All he has to do to get free is let go of the bananas. But they don't. They like the bananas. They hold on to the bananas, and they're sitting there like this, and you just come by and pick them up.
That's how you catch a monkey. We are wired to be loss averse. We do anything to avoid the risk of losing what we've already got, whether it's that certification you earned in some development language, some IDE, some operating system. You know, there's probably still a few OS/2 diehards out there somewhere. OS/2 rules. Yeah. Still chugging away somewhere. They're out there. But the thing is, you know, even though we're wired to be loss averse, you don't want to end up like the monkey. All right? Let go of that code. Let go of that method you used to use. Let go of the normal way you do stuff, whatever it might be. This next one is really two different ones: planning fallacy and optimism bias. Besides being an innate programmer talent, this is where you have a tendency to underestimate task completion times, and a tendency to be over-optimistic about the outcome of planned actions. Now, as programmers, we have this in spades. And I love this Far Side cartoon where it says, it's time we face reality. We're not exactly rocket scientists. But, yeah, estimation. I suck at it. I suck at it on the weekends. It's like, all right, I'm going to get to that plumbing project. I'm going to do this and we're going to get... It doesn't happen. You've got eight things on your list; if I get to half of the first one, yeah, maybe that was a good weekend. But, undaunted, you know, next week comes up, yeah, we're going to do these eight things. It's not going to happen. But we're wired that way. Hawthorne effect. This you've probably heard of. The tendency people have to change their behavior when they know they are being watched. All right, so the consultants come in, the boss comes in, whatever, some famous person comes in. Everything's spiffy. Everything is shiny because somebody's watching. And then they leave and it all slides back to normal again. We're wired that way. This is my favorite one.
Second-order incompetence, or lack of metacognitive ability. This is referred to in the psychological literature as the lemon juice man. The guy goes into a bank. This is a true story. This guy goes into a bank, robs the bank, broad daylight, takes the cash in the bag, leaves, goes home. Cops come to his door and arrest him. He's like, how did you find me? I had the perfect getaway. How did you find me? They're like, dude, we saw your picture on the security camera. He said, you couldn't have. I was wearing the juice. And they said, what? He said, you couldn't. Can you hear me now? You couldn't find me. I was wearing the juice. He says, everyone knows, if you put lemon juice on your face, you're invisible to security cameras. Not so much. Second-order incompetence is where you don't know what you don't know. He didn't even think to check, or maybe run a test. Here's a great campaign for test first. Or you get thrown in the hoosegow. But the problem is, as a result of this phenomenon, less skilled people will have a tendency to overrate their abilities. So that new person on the team, fresh in, doesn't know anything, they think they've got it sussed. They'll have this whole thing done by the weekend. The expert who's been doing this for 10, 20 years is like, I think we're screwed. And in fact, they all say it, and I'm beginning to believe it now: the true sign of an expert is when you realize you don't know anything about anything. Then you've made it. I'm almost there. Working on it. But again, this is how we're wired. It's just how it is. So with all of this, let me just throw out a few un-agile warning signs that can creep up. Things like sloganization, where you take some word and you beat it like a dead horse. Like TDD. Or agile itself. This is no joke. I had a friend email me last Thursday. Wanted me to come talk to his company. They want to do, brace yourselves, Agile accounting. Agile finance. Agile sales. Agile marketing. Can't wait to see that one.
They've had such success with Agile in their development organization, they want the entire rest of the organization. They want maintenance. They want the gals who clean the restrooms to be agile. Okay, there I can kind of see it. You know, devaluing a word, overusing a word to the point where it just becomes a slogan. It ends up like that company that had the big sign in the cafeteria that just said quality. The joke became, oh yes, here at Big Corp, where quality is a word on the wall. Because that's all it was. Following rules versus exercising judgment. Sometimes you need to do both. Everything gets checked into version control. That's a rule. You really want to follow that. You don't want to be breaking that. Some of the others? You probably want to exercise judgment. That's what being an expert is all about. Exercising judgment. If you don't let the experts on your team use their judgment, you're not getting the value of their expertise. Confusing the model with reality. This is one of my favorite ones. If you have so many lies you need a spreadsheet to keep track of them, then you're in trouble. That's just the model. That's not reality. Demanding conformity. We must all use vi. We must all use Emacs. We must all use TextMate. That's okay. Whatever it might be. And over-simplifying complex situations — that's what we're talking about. This one kills me. Anytime you get somebody writing an article, or a speaker up here, or a consultant, and they say, oh, it's really simple, all you have to do is X, for any value of X, they're lying through their teeth. You never, ever get to the point where all you have to do is X. It just doesn't work that way. They're leaving out nuances. They're leaving out the important part. So, how do any of us get past the advanced beginner stage of agile or anything else? What do I mean by advanced beginner? Glad you asked.
So, back in the 70s, the brothers Dreyfus, Hubert and Stuart, wanted to write some AI software. Not like Siri. This was the old days. They wanted to have software that learned skills the way people learn skills. They had no idea how people learned skills. So the first part of the research was, well, let's figure out how people learn, how people attain skills. And they did studies on airline pilots and chess players, all these kinds of things where you could fairly easily quantify expertise. You can do metrics on pilots. You can look at chess masters and you can grade them with some relative ease to say how skilled they are. So they came up with this model of five levels, going from novice to advanced beginner to competent practitioner to proficient and finally expert. Now what they also discovered is, for most people, at most tasks, you can get out of the novice stage pretty easily. You learn enough to kind of get around and you get to this advanced beginner stage, and then you're stuck. You never bother going past that, because probably you don't need to. This model is per skill in your life. It's not per person. You're not a novice or an advanced beginner or an expert. You're a novice chef, or a proficient programmer, or an advanced beginner skydiver, or whatever it might be. But most people get here and get stuck. This is sort of the relative population here. Fewer make the big jump to competent, fewer still to proficient, and bloody few are experts — in any given field that runs about one percent, maybe five percent, somewhere down in there. Now some interesting things happen as you go along this ladder from novice to expert. You begin with a reliance on rules. Context-free rules are the only way you can get by as a beginner. This is why, when you call up the call center and they're just going through that call sheet, they have no idea what's wrong with your thing. They're just going through the rules. Ask them if it's plugged in. Ask them if they have this.
Ask them if they have that. When you're an expert, you get really frustrated by that. You've got a blue screen of death from some IO request thing that you plugged in, and you call up and they go, is it plugged in, sir? We've been there. It's frustrating as hell. Experts don't even look at the rules anymore. They rely on intuition. In fact, if you force an expert to follow the rules, it degrades their performance to that of a novice. They proved that with the airline pilots, which is sort of interesting. If there's a corporate mandate that hamstrings the experts with a lot of rules — you have to do this, you have to do that — you're basically reducing their performance down to here, which is something we'd like to try to avoid. The other thing that changes here is a novice considers everything, whereas an expert knows just what to look at. And this happens a lot. You tell people, well, test everything that can possibly break. Well, to a novice, everything can break. So they're testing print statements. They're testing accessors, getters and setters, because hell, it can all break. I don't know. Whereas the expert's like, I'm not even going to look at that stuff. That stuff's trivial. Now, over here where I'm doing this weird funky thing with the race condition and the blah, okay, this is tricky. I know there's going to be problems here. I'm going to test the daylights out of this. The novice can't tell the difference. They don't know where to put their focus. So, with all that as a quick intro, how on earth do you actually get to really be an agile practitioner? Well, the first thing is you need the agile mindset. When I wrote Practices of an Agile Developer with Venkat Subramaniam, there was no definition, even. There still isn't really a definition of what agile is. So we had to make one up for the book. I made this one up. It was kind of reasonable.
It says, agile development uses feedback to make constant adjustments in a highly collaborative environment. There you go. That's what you can write down. That's a good one. That's really what you want to aim for. You need feedback. You need to constantly adjust, and you need an environment where you're collaborating with everyone. With the client, with your fellow developers, with whatever other resources are involved. High collaboration, feedback, constant, constant adjustments, daily, hourly, that sort of thing. Not everyone can do this. There's this must list I came up with. These are the things you have to be able to do. All of your development and management practices have to generate continuous, meaningful feedback. If you're not getting feedback from unit tests, if you're not getting feedback from meetings because you all just gather around and eat donuts, that's all very nice, but you need feedback. If you're not getting feedback, how can you make any changes to anything that the team has created? You need to be able to evaluate that feedback appropriately, in context. Well, Mary screwed up the build over the weekend. It's her fault. Well, is it really? What did the rest of the team do that let that happen? What are you missing? Now you know what's wrong. You need to be able to do something about it. You have to be able to change your code, your process, your team, your interface to the rest of the organization and to the customer, in order to respond to the feedback. There is nothing more soul-sucking or debilitating than figuring out what's wrong, knowing what you need to do to fix it, and not being able to — or rather, not being allowed to, for whatever reason. So, if you can't do these things, don't even try. Because you can't be successful without them.
So, if you can do this stuff, then go back to the Agile Manifesto where we said these things: value individuals and interactions, value working software, value customer collaboration, value responding to change, all that stuff. If you can honor that and have that kind of environment, then you've got it made. You've got the environment, you've got that definition. Adhere to those values. Boom! You're agile.

You keep trying. You've got to keep failing. That's another interesting idea. If you're not failing often, then you're not trying enough. You're not trying enough hard stuff. You're not advancing. You need to be failing, and you need to learn from that failure. Don't just keep stubbing your toe and then repeating it. And that's all there is to it. Is that the end of Agile? We said all this ten years ago. Nobody particularly believed us. They're just like, oh, we must pair, we must do unit tests, now we're Agile. They're doing Agile accounting now. Dying to see how that works out.

So, the real problem as I see it is we said all this stuff ten years ago, and we fully expected that by 2002 or 2003 there'd be a hundred new methodologies out there that people would take and say, okay, well, we took XP and we changed it and now it's like this. And we took Scrum and we added this and we did that. And that hasn't really happened. You've got a couple of main methodologies. You've got a handful of new practices. Things like planning poker were introduced later; it's a neat idea. You get some ideas from Lean coming in, and Kanban. But for the most part, we're talking handfuls of new ideas. Not thousands. Just a couple here and there.

So, what is it we're missing? Glad you asked. When you do stand-up theater improv, apparently there are two rules: you have to agree, and you have to add. So, what is improv? You get a couple of actors together and there's no script. We don't know who the characters are.
We don't know where the plot, if there is one, is going to go. On stage, in a pressure environment, they're told: okay, perform. Act. Make it up as you go along. Make it up as you go along. That has always been our mantra. There is no script. We don't know how it's going to turn out. We don't know who's playing what role. So, it's actually very similar.

So, in improv, to make this work, you have these two rules. Imagine this. The first character comes up and says, wow, I can't believe we made it here to the moon, and then the second character says, no, we're not on the moon. Thanks. Okay, now I'm screwed. Where do we go from there? Or, worse, you say, boy, I can't believe we made it here to the moon, and the other character says, yep. Now what? I'm just as stuck. So, you actually have to do both. You have to agree, and you have to add something. Yes, I agree with you, we're here on the moon, and wow, did you see that shadow move over there? Okay, now it's getting a little interesting. Now I've got something to work with, and on and on it goes.

You agree with the direction that's being taken, explicitly or not, and you add your own bit to carry it forward. That's the part we've been missing. We haven't been adding our own little thing to carry it forward. So: yes, we'll do pair programming. Yes, we'll do unit testing. Yes, we'll do TDD. Yes, we'll do whatever else. We'll do active refactoring. We'll do whatever you want, and we're going to add this and this because we need it, and we're going to stop doing that because, even though every article says it's a good idea, it is killing us dead. So, we're not going to do that anymore. We'll do something different, and we're going to get feedback from what we're doing differently, and if that's not working, we're going to change it. We're going to continuously change it until it works. That's what it's all about. Simple, straightforward. We said it ten years ago. I'm saying it again today.
I'll probably say it again ten years from now, because we probably still won't get it. That's the basic idea. So, that's really all I came to talk about. Those are a couple of the books that I wrote. There's my Twitter handle. Please follow me. The other guy said, well, if you're not interested in my stuff, don't follow me. Follow me. It's always cool stuff. You'll love it, I promise. You'll get every penny back that you invested in it. And there's my e-mail address. I do get thousands of e-mails, but, you know, hey, pile on, that's fine. I can take it. If you have anything that comes up later, a question or a comment or whatnot, please do follow up.

In other news, at the Pragmatic Programmers, we are actually looking for a skilled Rails programmer who does some sysadmin as well. If you're curious, go to pragprog.com slash help dash wanted; it explains it in gory detail. If you know anyone who you think would be a good fit, even if you're not, please spread the word. We sent this out on e-mail and Twitter.