Okay, so I'm here to talk about agile testing, a subject I really love. Agile testing is relatively new, and it sort of sprouted up after the agile methodologies and extreme programming plowed down the field of all the existing testing ideas. So there were some provocateurs in the agile movement. You heard one this morning: you heard Brian. Brian Marick was very instrumental in the agile testing movement, and he would say things like: QA, we shouldn't follow the V model — that doesn't really work. We shouldn't hand requirements documents to each other and communicate back with bugs. He said some very provocative things, as you can probably imagine. And then Kent Beck, who is kind of the master of calm statements — Kent Beck went to a QA conference and gave a speech to a whole assembled room of QA professionals, and Kent said: extreme programming will make the independent tester obsolete. There's a provocative statement, okay? And James Bach, who is not in the agile movement but has his own context-driven testing movement, has made plenty of provocative statements, such as: these manual scripted tests are not really the way to go; I think testers should just use their intelligence, look at the product, explore it, and do exploratory testing. All these provocative statements helped, like Brian was talking about, shake us out of some of our ideas about testing, and some new ones have sprung up. And this has made testing, I think, a much more enjoyable and maybe more successful contribution to the team. Okay, yeah, well, that came out well. All right. James Bach has a book called Lessons Learned in Software Testing, where he lists about 293 — give or take one or two — tips on testing. And the very first tip is this metaphor: testers are the headlights of the project. When I first read that, it made me think of camping right away. I thought of camping.
Okay, so my husband, John, who is here — he likes to go camping. We used to go camping a lot; I think maybe I have prevailed by now. For John, it's not camping unless it involves a great deal of off-road driving, which is what this picture is supposed to indicate. And it's not camping unless you do that in your little Saturn four-door black car. And it's not camping unless you do that in the middle of the night. So we did a lot of this, and I learned to value those headlights very, very dearly. As we'd be bumping along, going over the rocks — I don't know if I had my eyes open or closed — John and I were supposedly steering this project. The headlights, notice, weren't steering the project, but they were providing the information that we needed. So even if we were arguing, we still had to have that information to tell us: where are the rocks, where are the trees, how close are we to the cliff? Okay, the headlights are extremely valuable. I really do like this metaphor of James Bach's, because it makes me feel very valuable as a tester on my projects. We wanna contribute, right? Okay, but some of you are probably thinking: now wait a minute, okay, this is great theory, but that's not the way it is on my project. On my project, I don't feel like I'm contributing all that much. I've read the requirements document — okay, it took me a while to do that — and that's great, but I don't feel like I've contributed. I've written up a whole bunch of test cases and test plans and test scripts. Okay, that's super, but I don't know yet if I've really contributed in the way that we're talking about with headlights here. Actual testing? Well, the code isn't ready for me to test yet, but Rick is off in his cubicle, busy coding away with his stubby little fingers, and Rick the programmer is doing his very best to get me something to test.
So when he does give me something to test, I'm really happy — I can contribute now, right? So I try it out, and it throws an error, locks up, crashes Windows, and melts my hard drive. And that was just the install. I have to kick it back to Rick — I'm sad to do it — and there I am again, you know: exactly what am I doing on this project? Okay, well, how many of you out there are testers, QA testers? Okay, quite a few, all right. Some of you may have been on this project. At least I have been on this project. And on projects like this, I don't feel so much like headlights. I feel like taillights, okay? More often than not as testers, if we're on a code-and-fix methodology project — code and fix — we're really good at telling the project where it's been. We're really good at saying: you know, you made a wrong turn here. Okay, how much help is that, right? And sometimes our findings don't even matter, because the product has to ship by the date. Bugs or no bugs, it has to ship by the date. So we're just telling people, for their entertainment purposes: these are the bugs in your product. Okay, ship it anyway. Or if the release date does get delayed, we enter this unending cycle of find a bug, fix a bug, find a bug, fix a bug. Regression test — oh, the fix introduced some more bugs — find a bug, fix a bug, find a bug, fix a bug. And nobody on the project can say how long it will be before the product's ready to release. Least of all me, the tester. I can't tell you how long this cycle will go on, because it's completely unpredictable. Well, what happens, though — okay, that's kind of bleak — but what happens when this project goes agile, right? Whee, we're going agile, great, everything speeds up, right? So, things definitely do get better.
Okay, now Rick, my stubby-fingered programmer friend, is time-boxing his work and working iteratively. So I'm getting builds a lot earlier — a lot earlier. And Rick is unit testing his code, right? He must have gone to Christian's talk, and he's now writing his code test-first, TDD. So the code that I get basically works. It doesn't melt my hard drive. It doesn't hang Windows. Well, it might hang Windows Vista, but that's another story. And the features basically work. So things are good. However, I start to notice very soon that I'm having to test a lot more, and I'm having to test a lot faster, okay? One of the reasons is that because we're working iteratively, we're releasing more often. All these piles of tests that I need to do before release — I have to do them more often. I have to test faster, and my delaying tactics don't work anymore, okay? Because Rick is over there — oh, and now he's pair programming with Julie, so he's taught her to be just as productive as he is. So I'm getting twice the amount of features now, because I've got productive programmers. And I can't just sit down, find five critical bugs in five minutes, throw it back to them, and buy myself time to do my real testing. I can't do that anymore, because the bugs are harder to find, okay? They're not so critical. So my delaying tactics no longer work. You notice I am still the taillights. I'm coming along after Rick and Julie have coded. But now I'm the taillights and I'm not even keeping up with the car. And management does notice, okay? Management notices. And I can just imagine them sitting in their little management room, sitting around the table, looking at each other and saying: the testers aren't keeping up. What do we need? What we need here are faster taillights. So if we could get these testers out, maybe to a conference, and have them learn about agile testing, they would then be able to keep up. Yeah, okay.
Well, imagine for just a moment that there's a little bit more to agile testing than just faster taillights. That there's a little bit more to it than doing the same thing we always did before, only faster and with no requirements. Well, there are some secrets to agile testing. I would love to tell you about them, but unfortunately I can't, because the agile testing secret society might kill me if I reveal the secrets. You know I'm kidding. I've been watching too much Da Vinci Code. There is no agile testing secret society, and they are not recording this talk. Okay, that's my story, I'm sticking to it. Just in case, I would be a little careful, so I'm gonna tell you stories instead. So if you accidentally pick up some secrets out of these stories, that's a bonus, but it's not my fault. The first story I'm gonna tell you is about Sabre Airline Solutions. Sabre Airline Solutions makes software for airlines, right? If you've ever bought an airline ticket — how many of you have bought an airline ticket? I know, your arms are tired; we're giving you exercise. If you've ever bought an airline ticket, your transaction has probably gone through Sabre's systems at some point or another. They write software for lots of airlines. And they have become very interested in agile methods and extreme programming. Well, this is an experience report that they gave — I think it was Agile 2007 in Washington, DC — about a project that they had that went terribly wrong, and how they turned it around. So they do their work on contracts; an airline will contract with them. There was a particular contract they really wanted, or the sales guys really wanted, I don't know. And it was this key market — you know, it always is. It's this key customer — it always is, right? This is great; I mean, they wanted the project, but Sabre was so busy with other projects that they didn't have enough testers to pull it off.
So what they did was they talked with this client and said: look, we wanna do this with you, and we wanna structure this contract so that you will send some of your people to help test. Well, the client agreed to it; everyone was pretty happy about that. That seemed like a good idea — we'll work together. And they signed the contract, and the Sabre people went off and started coding, right? Frantically coding, frantically coding, frantically coding. All right, after about two iterations, the client testers show up: here we are! And the Sabre people say: oh, you're here already. Okay, well, they escorted them to this beautiful, comfortable testing lab on the first floor of Sabre headquarters, which I think is in Texas, and then the Sabre team promptly retreated back up to their third-floor cave and went back to frantically coding, because now we have to have something to show these client testers. So the client testers got rather frustrated: we've got nothing to see; we've been sitting here for two more iterations and you're not even talking to us. So the sales guys must have heard about this. Anyway, there was panic at Sabre, right? We don't want to lose this contract. They called an emergency meeting, and after they stopped yelling at each other and blaming each other: okay, what are we gonna do to solve this problem? They decided to bring those client testers up to the third floor, integrate them with the development team, work with them every day, and the client testers would write acceptance criteria for each of the user stories they had on their three-by-five cards. And they would write these acceptance criteria stating what that feature was supposed to do in more detail — in a testable amount of detail. They would do that before the feature was implemented. And this was the thing that they said turned their project around.
So, some of the benefits they got: they said in their experience report they had a 60% reduction in rework. They usually had this big testing squeeze at the end, where development was late and testing would scurry, scurry, scurry and find bugs, fix bugs. And they had to pull people from all over Sabre to come test, because we're in crunch mode. They didn't have that on this project. They sailed through their testing at the end, and the development team and the customer were both so happy that they requested to always work this way whenever they did any project together. Other Sabre customers heard about it: hey, we heard about this — will you let us send people to Sabre to help you test? So other customers started to get into the act. And this was because they were specifying their features up front and they were talking to each other. You probably noticed. All right. Well, some of you may be thinking, again: wait a minute there, okay. That sounds a lot like requirements. I mean, you're specifying things in advance. That sounds too much like requirements. I thought agile projects don't do requirements. Well, if I burst anyone's bubble, I'm sorry, but I'll make a provocative statement: agile projects do requirements. We just do them better. The whole point of the agile manifesto — the very first sentence in the agile manifesto — is: we are uncovering better ways of developing software. So we in the agile community believe that acceptance criteria are a better way of doing requirements. You get lots of advantages from them. I think I'll skip over that, but you heard the advantages that Sabre got from them. What's the downside? You have to talk to each other. Okay. So it's probably a good idea to be nice to each other. I don't know if you remember the movie Bill and Ted's Excellent Adventure: be excellent to each other.
Especially, I think, testers need to work on this, just because we're in this position where we're always criticizing — that's our job. So I think it's very important for testers to really work on being nice to people. Myself, personally, I just keep bringing bagels until people like me. Well, I think that acceptance criteria are an improvement over requirements. That's my opinion, but I'll give you three reasons why I think that's the case. Okay, so the first thing is: acceptance criteria are developed collaboratively. We talked about the requirements document. The requirements document is usually written by one person. And if you're really lucky, they might go around and talk to people first before they write it. But acceptance criteria are developed collaboratively. You can't get away with any of the BS, because you have to work out the hard problems or you're not all gonna agree on the acceptance criteria. Second, they're developed just in time — just when you need them. That requirements document is probably created early in the project. By the time it gets to the testing phase, how many of those requirements are still relevant? Yeah, usually it's maybe 25%, a little more if you're lucky. They're just not relevant anymore. Acceptance criteria are always relevant, because they are developed exactly when you need them. The third thing is that acceptance criteria are used directly for testing. With a requirements document — even if you have one, even if it was created somewhat collaboratively, even if it's still relevant — you still translate those requirements over to test cases and test objectives. Well, why do that, right? I mean, that's extra work. The person writing the requirements document probably wasn't thinking in detail about how you might test it. Acceptance criteria are used directly for testing: they are the test objectives. All right, well, I hope I've convinced you to at least look up acceptance criteria and learn a little more about them.
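As an illustration of acceptance criteria doubling as test objectives, here is a sketch in the style of plain xUnit-style checks. The story, the criteria, and the toy `search_flights` function are all invented for this example — they are not from the Sabre report. Real teams might capture the same criteria as FitNesse tables, Cucumber scenarios, or plain English on the card; the point is that each criterion is written in testable detail before the feature is built.

```python
# Hypothetical acceptance criteria for a user story, written as executable
# checks before the feature is implemented. All names here are invented
# for illustration.

def search_flights(origin, destination):
    """Toy implementation standing in for the real feature under test."""
    flights = [
        {"origin": "DFW", "destination": "ORD", "seats": 12},
        {"origin": "DFW", "destination": "ORD", "seats": 0},
    ]
    return [f for f in flights
            if f["origin"] == origin and f["destination"] == destination
            and f["seats"] > 0]

# Story: "As a traveler, I can search for flights between two cities."
# Acceptance criteria, agreed on by customer, developer, and tester:

def test_search_returns_only_matching_routes():
    results = search_flights("DFW", "ORD")
    assert all(f["origin"] == "DFW" and f["destination"] == "ORD"
               for f in results)

def test_sold_out_flights_are_excluded():
    results = search_flights("DFW", "ORD")
    assert all(f["seats"] > 0 for f in results)

def test_unknown_route_returns_empty_list():
    assert search_flights("DFW", "XYZ") == []
```

Because the criteria are executable, there is nothing to translate later: the tester runs these directly, and "done" means they pass.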
There is another secret to agile testing, which of course I can't tell you either, but I'll tell you another story — we'll work on that principle here. I'll tell you about two different projects, and you get to compare and contrast them. The first project I call — if it comes up — How to Lie with Burn-Down Charts. This was a fairly large project; they had scoped it out and figured it would take nine months. They knew they were gonna run into some challenges, so they said a year. They worked in two-week iterations; they met together every two weeks and looked at the pretty charts. Pretty charts — this one's very pink. Oh my gosh. All right, and you've probably seen burn-down charts before. You can see how, on the left side, you're measuring the number of features left to do before you release. And along the bottom is the timeline of your project. Now, the straight line is your projection: if everything were perfect and we burned down features at this constant rate and got them all done, that's when we could ship. And the other line is their actual, which gets filled in as you go. So they look at this chart and — you know, there are some challenges, it's not perfect, but the features are burning down, right? So if the very end of that line is the one-year mark, do you think they made their one-year mark? Or do you think they missed it by very much? Maybe it doesn't look too bad, but there's one thing I neglected to tell you. They burned down their features on this chart based upon when development said the feature was coded. These are not based upon QA testing the feature and saying that it works, or other people reviewing it and saying it works according to what the customer wants, okay? Every time you do that sort of thing — I don't know how many of you went to the evening social last night and heard Neal's talk, but he talked about metrics. He said: meaningless metrics create what?
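The mechanics of the chart she describes can be sketched in a few lines. This is an illustration with invented numbers, not data from the talk: the projection line burns features down at a constant rate toward the planned release, while the actual line records what was reported done after each iteration.

```python
# Minimal burn-down bookkeeping, with invented numbers for illustration.
# The projection assumes a perfectly constant burn rate; the actual line
# records how many features remain after each completed iteration.

def projection(total_features, iterations):
    """Remaining features if we burned down at a constant rate."""
    rate = total_features / iterations
    return [round(total_features - rate * i, 1) for i in range(iterations + 1)]

def actual(total_features, done_per_iteration):
    """Remaining features based on what was actually reported done."""
    remaining = [total_features]
    for done in done_per_iteration:
        remaining.append(remaining[-1] - done)
    return remaining

ideal = projection(120, 26)              # 26 two-week iterations in a year
reported = actual(120, [4, 5, 6, 4, 5])  # first five iterations

print(ideal[0], ideal[-1])   # 120.0 at the start, 0.0 at the planned release
print(reported[-1])          # 96 features still "to do" after five iterations
```

The chart itself is honest arithmetic; the lie she points out is in what gets counted as "done" before a feature is burned off the chart.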
Meaningless behavior — thank you, Joel. Meaningless metrics create meaningless behavior. Here we are measuring for completed code, but not tested code. So what are we gonna get a lot of? Code that's completed, but not tested. And what are we gonna get at the end of our project? A big testing phase — and remember, nobody knows how long that phase takes. It's a huge uncertainty. And in fact on this project — this is loosely based, but it's based on a real project — they got their coding done in a little less than a year, and their testing phase went off into the next room. It was one and a half years in testing before it released. That's what happens when you allow your risk to pile up: you just don't know. And so by giving this burn-down chart, you are in effect lying to your management about your progress. Okay, you don't wanna leave testing behind. A different project I was on had a whole-team approach, and this was their burn-down chart. Oh, that looks the same, okay. They had a burn-down chart too, and they had about a year-long project too. And they had two-week iterations too. But every two weeks they met together and developed acceptance criteria. Every two weeks in their iteration planning, they put two estimates on each story: one for development, one for QA. So they knew how much time it was gonna take them. They only took into the iteration as much as they could code and QA. So all of these marks here actually represent running tested features. This project — I'll just tell you the ending — released on time, to the day they said at the beginning of the project. A one-year project, released on time to the day. And it had another advantage over the other case study: this project had three reported customer bugs in its lifetime. I can tell you the other project had many, many more than three bugs reported by customers. Okay, we're moving right along. So those are — well, I have not told you.
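The second project's planning rule — put a development estimate and a QA estimate on each story, and take on only what both can finish — can be sketched like this. The story names, the numbers, and the greedy selection are invented for illustration; the point is that a story doesn't make it into the iteration unless there is capacity to both code it and test it.

```python
# Iteration planning with two estimates per story, as in the second case
# study: one for development, one for QA. A story is only taken if there
# is enough capacity left on BOTH sides. Stories and numbers are invented.

def plan_iteration(stories, dev_capacity, qa_capacity):
    """Greedily select stories that fit both dev and QA capacity."""
    taken = []
    dev_left, qa_left = dev_capacity, qa_capacity
    for story in stories:
        if story["dev"] <= dev_left and story["qa"] <= qa_left:
            taken.append(story["name"])
            dev_left -= story["dev"]
            qa_left -= story["qa"]
    return taken

backlog = [
    {"name": "search flights",  "dev": 3, "qa": 2},
    {"name": "book seat",       "dev": 4, "qa": 4},
    {"name": "email itinerary", "dev": 2, "qa": 1},
]

# With 8 dev-days but only 5 QA-days, "book seat" doesn't fit. Counting
# only the dev estimate would have taken it anyway and let testing fall
# behind — exactly the failure mode of the first project.
print(plan_iteration(backlog, dev_capacity=8, qa_capacity=5))
# ['search flights', 'email itinerary']
```

With this rule, everything burned down on the chart is a running, tested feature, which is why the second team's chart could be trusted.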
Remember, I have not told you the two secrets of agile testing. The first secret: if you define features before they're built, then the testing afterward goes very well. The second secret: if you measure progress by running tested features, if you work as a whole team and help each other and don't let the testers get left behind, you'll be able to predict your schedule, and testing gets the help it needs. Okay, this is a commitment on the part of the team. Because, I can tell you, if you're water skiing and you fall, it doesn't matter how motivated you are or how much management tries to motivate you — you can't keep up with the speedboat without the speedboat coming back for you. You can't do it. I used to be a developer, as Christian said, and I've been a tester for quite a while now, and I just wanna tell you how much I love testing. I really do, and that's why I stayed with it and didn't go back to development. I love it because with agile testing, I'm actually contributing to the project. I am the headlights. I'm working on those acceptance criteria — in fact, usually I'm driving the acceptance criteria. I can help the project out by offering my services to drive their acceptance criteria definition: bring the people together, get them talking to each other, define the features before they are built. And then I also love agile testing because the teams don't let testing get left behind. And I actually have a top-10 list. How much time do I have? Okay, let's see how this works. All right, I have a top-10 list. I love testing so much — let's see if other people love agile testing too. And remember, I'm able to come up with this list because of all the work that those provocateurs did early on in our history, cutting down the old testing ideas and letting the new ones sprout up. So let's see, what's the tenth reason, okay? Well, I will tell you the top 10 reasons, apparently. Okay, I don't know where that came from.
But the top 10 reasons are — okay, oh, you can't see that. Number 10: no more manual test scripts. Do we love manual test scripts as testers? No — I see some heads shaking. Okay, number nine: I don't have to work in a cubicle. I really don't like working in a cubicle; I'd rather talk to people. Number eight: I'm allowed to think about stuff. I'm allowed to learn stuff. As a tester, I'm expected to be intelligent, not to be some kind of rote machine going through test scripts. The developers don't hide when they see me coming. Developers actually like me, because I'm actually on their team; I'm helping them. Hmm? Yeah, I brought the bagels — maybe that's it. Food is always good. Okay, I get to investigate and solve complicated problems — I even get paid to do this. It's like solving mysteries every day. Okay, here's one from the agile testing list that I particularly like. I didn't come up with this one, but he did: I can impact quality instead of just documenting the lack of it. Customers actually like the product — how about that for satisfaction? No more death marches. Okay, I've been on death marches. Do we love death marches? No. I almost left software because of death marches. Now I don't have them very often, so I'm really happy about that. There's always time for testing, because testing happens first. Yay — sick of getting squeezed out, all right. And — Brian actually mentioned this this morning, which I thought was really interesting — the number one reason I love agile testing is because I really do get to hear people say: this is the best project I've worked on in my life. And there's nothing like that to make me feel good as a member of that project. So I will leave you with that. Sorry about the slides, but they were pink anyway; I don't know where that deck came from. And I will take questions. Yes?
How do you deal with the division of responsibility for validating your acceptance criteria between developers and testers? The question is: how do you deal with the division of responsibility for validating the acceptance criteria between developers and testers — how do you share the load? Okay, well, in the case study I gave you with the whole-team approach, where developers went back to help with testing: they were jointly responsible for the stories. They were made responsible in the iteration planning, because a developer and a tester would sign up for each story. So both the developer and the tester would do whatever it took to get that story developed and tested in the iteration. They could work out amongst themselves the division of labor, right? They'd start out with the developer coding and the tester working on the tests. But if testing got behind, those developers would write tests, help the testers, and do the testing. So the allocation of responsibility is a whole-team approach. Other questions? Okay, we've got to stop. All right, thank you so much.