Namaste. I'm Johannes Broadwell, and I'm here at Agile India with Todd Little. Todd gave three talks at the conference this year. I watched your talk on the first day, Todd, so could you summarize what it's about?

So it was our experience at Halliburton and Landmark in dealing with offshore teams: some of the challenges and successes we had, some of the activities we went through to try to make it work well, some of the rethinking we had to do, and the particular challenges, as well as successes, of working with India. This particular project was actually pulled back from India because we couldn't find the domain expertise we needed. So we focused on finding the right talent first, and once we found it we had some great successes with companies in Romania and Vietnam.

Alright, so I guess the first question when it comes to going with global teams is: why would you want to do it?

Well, a couple of reasons. One, obviously, there's a cost advantage, and that was one of the drivers. The other, though, was that we couldn't find the talent; we were having difficulty finding it anywhere. I'm in the petroleum engineering business, and one of the things we needed was access to petroleum engineers. The petroleum industry had been booming. We were doing this work in the late 2000s, and it was very difficult to find petroleum engineers anywhere in the world; they were being snapped up all over the place. That was actually the reason we had to pull the work back from India. We kept trying to get petroleum engineers in India; we'd get them, and then they'd be snapped up by some other organization. It just didn't work, so we had to find them somewhere else, and Romania happened to be a place that had them.
We had a good connection to a former petroleum engineering professor, and he was able to pull in some of his contacts from the university. He had a software company, pulled them together, and they were very good talent for us to work with.

So here's one of the lessons from the Agile Manifesto: it's about individuals and interactions.

Oh, absolutely. It was about individuals and interactions, and about creating autonomous teams that could do the work. One of the reasons we felt we really had to have the domain experience, not just software experience, was that they needed to understand the business and be able to communicate at a rich level. Our team in Houston could really speak the language of petroleum engineering to the team in Romania, and then we had a team in Vietnam doing automation testing, and we could speak the language of test automation with them. So it really was a decomposition, and those teams could work autonomously and deliver results. It worked out very well for us: we had a significant improvement in our overall test coverage, we significantly reduced the defects we found in beta, and by the time we actually shipped we had captured almost all of the beta defects, close to a 97% reduction in the known defects we shipped with.

So how did you get that reduction in defects?

Part of it was looking at our overall test strategy: the entire team examined it and figured out where we needed to fill in the gaps. Then, for the extra work that created, we leveraged the offshore teams to build additional test automation, working with a test automation specialist. One of the challenges of being in such a specialized domain was that most of our testers were focused on the petroleum engineering domain; they weren't automation specialists.
So we went looking for talent with an automation specialty, and that happened to be a company in Vietnam. So that's how we did it; the essence of it is that we needed talent, so why look just locally? Let's look globally.

So you had a need for very specific talent, and you found it wherever you could.

We found it wherever we could, and then we worked to see what we could do to optimize globally. One of the things we ran into initially was everyone complaining about the build time. We had multiple builds, semi-continuous integration, but the master build was at a specific time, and based on that time the team in Vietnam couldn't actually finish their work. So we said, what if we reinvestigate how we're doing this? We looked at our schedule and changed the build time so that the Vietnam team would have the build available in their morning and could run the automation tests during their day. By the time they were done, the Romanian team's petroleum engineers could investigate any false positives and do the engineering analysis of the test results, and by morning in Houston, the Houston team would know whether or not it was a decent build. So we turned a problem, the time shift, into an advantage.

Right, so the code would ship around the world.

The code shipped around the world, but I think in the early days people thought about code shipping around the world from a factory perspective: I do development, then they do development. This was different. Each site had its own compartment that made sense for the skills it had.

I found a similar effect when I worked closely with a team split between Europe and Asia, where we had three or four hours of overlap. That was actually a huge help for us too.
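As a quick aside, the follow-the-sun handoff described here can be sketched with a time-zone calculation. The 23:00 UTC build hour and the sample date are assumptions for illustration; the actual schedule the teams used isn't stated in the interview.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library in Python 3.9+

# Hypothetical master build fired at 23:00 UTC; the real hour the teams
# chose isn't given, this just illustrates the handoff timing.
build_utc = datetime(2024, 1, 15, 23, 0, tzinfo=ZoneInfo("UTC"))

sites = {
    "Houston": "America/Chicago",
    "Romania": "Europe/Bucharest",
    "Vietnam": "Asia/Ho_Chi_Minh",
}

for city, tz in sites.items():
    local = build_utc.astimezone(ZoneInfo(tz))
    # Vietnam picks the build up at the start of its day, runs the
    # automated tests, and Romania reviews results before Houston wakes.
    print(f"{city}: build available at {local:%H:%M} local time")
```

With this choice the build lands around 06:00 in Ho Chi Minh City, so the test run finishes in time for Bucharest's workday, which in turn finishes before Houston's morning.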
That is almost perfect, because then you have some time for yourself and some time together.

Yeah, that worked to our advantage as well with two remote locations, because the team in Romania was the intermediary that had the overlap. We had almost no overlap with the Vietnamese team; it's almost a 12-hour difference between Houston and Vietnam. Vietnam to Romania was four hours, and Romania to Houston was eight. It worked out very well because the Romanian team could be the central team, the intermediary, and that worked to our advantage.

So did you talk about other examples of teams as well in your talk?

We had other issues; I went through some of the challenges we had in outsourcing. One of the big ones, being in a very specialized industry, petroleum engineering, was that we relied heavily on customer data, and customers are very sensitive about their data. They don't like it getting into other hands. They really didn't even want us to have it, but they were our partners, so they let us work with it, and they felt good about that. But they really didn't like it getting outside our building. So we had to recognize up front that this was going to be a challenge, and we created synthetic data that tried to reproduce many of the types of problems we were seeing; that was the test data we used with the remote teams. I think that's an issue people face in many industries. I've talked to many of them: banking has sensitive data, pharmaceuticals as well. Each one has a slightly different nuance of the same type of problem.

So how would you summarize the keys to success?

A couple of keys. One is that you need to set the offshore teams up to have autonomy, and that was a big part of the talk.
I went into the issues of how to create autonomy and ownership: making sure the teams are aware of the purpose and have all of the capabilities they need. In particular, our Romanian team had petroleum engineering expertise that helped them deliver. They didn't have to ask us a lot of questions; they could make all the micro-decisions they faced on a daily basis without waiting for us to give them direction. And even if they made a wrong decision, we could talk about it and fix it. So autonomy is a key part.

Another part is transparency and honesty. That was perhaps a little controversial, because I threw out there some of the challenges we had dealing with some of our Indian vendors, where we didn't always mesh well with the cultural issues of hierarchy, and where people are sometimes afraid of telling the truth, so it doesn't always come out as directly as it could. I encouraged them, and I encourage my own teams, to be frank and honest. I tell my local teams that if they're not getting the code quality they want, they need to go back to the remote teams and tell them so. And I tell my remote teams that when they're really mature, they'll be telling my local teams that their code isn't good, and then we'll get the quality.

There's also the challenge that people are often afraid for their jobs, what I call the xenophobia effect: they're afraid of dealing with foreigners.

So that would be more of an issue in the United States.

Probably, although you never know. Maybe the Romanians are afraid of the Vietnamese, who knows. My view is that I'm a global citizen. I deal with everybody. It's about working with global talent, and I enjoy meeting people from all over the place.
So your second talk was on a topic I find quite interesting myself, which is estimation.

Estimation, yes, absolutely.

And the title was myth-busting.

Myth-busting, yes.

There's a TV program in the U.S. where some physicists get together and...

Actually, they're special-effects guys from the movie industry.

Awesome, yeah, they're great guys. And they take various myths and look at whether those myths are real or busted. So what are the myths of estimation?

So many myths. I think one of the biggest is the cognitive dissonance between the reality of software estimation and what many people expect estimation to be capable of. Many people believe that estimation should be easy and very precise. The reality is that, particularly early in a project, the range of potential error is huge. I've looked at a lot of data, and this has been repeatedly reported in journals: it's in fact fairly common to see a range of four to one between the 10% and 90% probability levels. That's way wider than most management would expect.

But doesn't it even out? You estimate one thing higher and one thing lower.

Well, the problem with that one-to-four range is that it turns out to be lognormal, and a lognormal has a long tail. And the other side of it is that we tend to be horrendously optimistic in this industry: while the range is one to four, we typically estimate the one. Which means the median is a two, and the mean is even a little more than two because of the long tail. So there are some really dangerous zones in what people are looking at. People know that scope creep is likely, but they don't include it in their estimates. They basically work from a wishful-thinking model.

So the first myth, I guess, was around the feasibility of predicting.
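The arithmetic behind "the median is a two, the mean a little more" can be checked with a small stdlib-only simulation: calibrate a lognormal so its 10th and 90th percentiles sit a factor of four apart, with the optimistic estimate at the low end. The specific calibration is an illustrative sketch, not Todd's dataset.

```python
import math
import random

random.seed(0)

# Calibrate a lognormal so the 10th percentile is 1x the estimate and
# the 90th percentile is 4x: the four-to-one range from the interview.
z90 = 1.2816                      # standard normal 90th-percentile z-score
sigma = math.log(4) / (2 * z90)   # spread on the log scale
mu = math.log(4) / 2              # center: median = exp(mu) = 2

samples = sorted(random.lognormvariate(mu, sigma) for _ in range(100_000))

median = samples[len(samples) // 2]
mean = sum(samples) / len(samples)

print(f"median ~ {median:.2f}x the estimate")   # about 2.0
print(f"mean   ~ {mean:.2f}x the estimate")     # about 2.3; the long tail pulls it up
```

So if teams habitually quote the optimistic end, actuals come in at roughly double the estimate half the time, and the average overrun is worse still.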
The first myth is around what's realistic in terms of the range. Does general management, or even the team, understand the limitations of their estimates? Almost every general manager I've come across thinks people should be able to estimate within plus or minus 25%. But they can't. It just doesn't happen. The only way it happens is through some sort of mandate, or by constraining other things. You can do it, there are ways to get estimates to come out right, but it's usually by manipulating other things in order to get there.

Are those good things to manipulate?

Maybe. If it's a matter of adjusting scope in order to make that happen, fine. What's not good is cutting scope in a way that reduces value substantially, or cutting quality, which also reduces value substantially.

But it does sound like, if there's one thing you can do in order to keep your estimate, it is to cut scope.

Usually that's the most appropriate lever; you just don't want to cut scope that's fundamental to delivering value. Sometimes even in the agile community we say, well, the date is given, so we're going to cut scope in order to deliver. That may not be the right answer. A huge example I give is the Ford Taurus. The first release was hugely successful, and they did all the right things: all the customer surveys, all the interactions, multiple iterations to make sure they were delivering the product customers wanted. But they were six months late. Hugely successful product. Saved the company. Made lots of money for Ford. And they fired the project manager. On the second rev of the Taurus, the project manager made sure he shipped on time. He cut corners every place he could, because he was going to make sure it shipped on time: no user surveys, no multiple iterations, everything cut to meet the date. Not very successful.
Success is about more than meeting the budget.

Right. We've made such a big deal in our industry about meeting the budget or meeting the schedule. If that's all we're operating on, it's stupid.

I think I heard that the Sydney Opera House had about a four-times cost overrun, or something like that.

Wouldn't be surprised, yeah.

But look at the impact.

Beautiful impact, yeah. The Big Dig in Boston had huge overruns too. But you want to get it right if you're going to do it. Maybe it was a mistake doing it in the first place, but if you're going to do it, you'd better make sure you get it right.

So what are some more myths about estimation?

One of them is sort of a myth, sort of not. A lot of people in the agile community say that relative estimation is so much better. It turns out that relative estimation is no better or worse than absolute estimation. What is true, though, is that velocity is a great tool for correcting the biases that we and others have. Both relative estimation and absolute estimation against some norm are subject to anchoring: once I have a number and I estimate something relative to it, I'm biased by whatever I'm anchored on. Both have the same type of problem. The important thing is that velocity can correct for some of those biases, and when we do relative estimation we almost always use velocity. People who don't do relative estimation maybe estimate in hours or ideal days instead, and then they're stuck with this concept of an ideal day, which carries its own connotation. When someone hears "ideal day," they're not willing to divide it by velocity and turn it into some other kind of day, so the units get messed up.

So, for the audience who might not know it, how do you do relative estimation?
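The bias-correcting property of velocity mentioned here can be shown with a little arithmetic; all the sprint numbers below are invented for illustration.

```python
# Velocity as a bias corrector: even if every story is estimated with the
# same optimistic bias, dividing remaining points by observed velocity
# cancels the bias out of the forecast.

estimated_points_done = 60                 # story points completed so far
sprints_elapsed = 3
velocity = estimated_points_done / sprints_elapsed   # 20 points per sprint

remaining_points = 100
forecast_sprints = remaining_points / velocity
print(f"forecast: {forecast_sprints:.1f} more sprints")

# If the team consistently under-estimates by a factor of two, both the
# observed velocity and the remaining total shrink by the same factor,
# so the forecast in sprints is unchanged.
biased_velocity = (estimated_points_done / 2) / sprints_elapsed
biased_remaining = remaining_points / 2
assert biased_remaining / biased_velocity == forecast_sprints
```

This is why a consistent anchoring bias is mostly harmless once velocity is in the loop: the bias appears in both numerator and denominator and divides out.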
Relative estimation is choosing some base story and saying: this is a two, some story-point number I'm going to use as a two. Then I estimate all my other stories relative to that. It's fine, it's an okay approach. But many in the agile community say we're particularly good at it, when the evidence shows we're not actually any better at it than we are without that relative perspective.

I looked over some of my previous projects and found that everything was a three or a five or an eight, which is within the range of uncertainty you have anyway.

Correct, yes.

And I found that the threes often turned out to be bigger than some of the eights. So I realized that on that project it wouldn't have made a difference if I hadn't estimated at all.

Correct. And that was the basis of my final myth: are we wasting time doing estimation at all? I consider that one plausible. Many people spend way too much time on estimation and get so wrapped up in it that they lose sight of why they're estimating. The first thing you always have to do, no matter what you're doing, is ask why. Why am I estimating? Because the customer asked for it? Maybe, in which case you have to decide whether you're doing it for the customer and it has to be done. But does the customer even know why they want it?

Probably not.

Probably not. So those are the questions.

So what do people want to estimate?

I think there are a couple of core business reasons. They're trying to figure out when they're going to be done. A product company is looking at marketing efforts and whether they'll be done within a particular time window. They're asking: should we even do this project? Is it economically viable? Do we need to add more people?
The other reason for estimation, and this is the part that often gets ignored, is that we're also estimating value. Prioritization is really about value per unit cost. Often we say, well, the product owner is going to do the prioritization, but then they're only looking at one side of it: the value side, which also has huge uncertainties, probably even larger than those on the cost side. I like to get the team together to look at a combination of value and cost and do some quick analysis of value over cost, because that gives us a good perspective on prioritization. So some quick estimation, whether purely on value or including the cost side.

In a case like yours, where almost all the stories fit between threes and eights, you may not have to worry about the cost side at all; just call everything a five and you'll be fine. In fact, that's the other thing I tell teams: if you really feel you're wasting time on estimation but some corporate policy says you need to do something, just call everything a two, or everything a five. It doesn't really matter, right? That may get you through the system. Or maybe insert some randomness in there to make them feel good.

So when you look at those questions, market window, profitability, budgeting for the size of the team, prioritizing, those are questions that require different kinds of answers, right?

Absolutely. And sometimes different people, too, because you may engage different people than are normally engaged in the estimation process.
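The quick value-over-cost analysis Todd describes might look like this in practice; the story names and point values are invented examples.

```python
# Quick value-over-cost prioritization: rank stories by estimated value
# per unit cost, both sides expressed in rough relative points.

stories = [
    {"name": "export report", "value": 8,  "cost": 5},
    {"name": "bulk import",   "value": 13, "cost": 8},
    {"name": "login audit",   "value": 3,  "cost": 1},
    {"name": "dark mode",     "value": 2,  "cost": 3},
]

for story in stories:
    story["ratio"] = story["value"] / story["cost"]

# Highest value per unit cost first: a rough cut for prioritization,
# not a substitute for judgment about scope that is fundamental to value.
prioritized = sorted(stories, key=lambda s: s["ratio"], reverse=True)
for story in prioritized:
    print(f"{story['name']}: {story['ratio']:.2f}")
```

Note that if cost estimates all cluster in a narrow band, as in the three-five-eight example above, the ranking is driven almost entirely by value, which is Todd's point about sometimes skipping cost estimation entirely.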
And I encourage my teams to bring in the right level and the right types of people, to make sure everyone understands why they're doing what they're doing and what business problem they're trying to answer, and then buys into it so they own it together.

Great. Are you ready for the lightning round?

The lightning round? Absolutely, let's go.

Alright, in the lightning round I'll ask a question, and when you've answered it you can ding the bell. Do you want to try it? That's good. Brilliant. So, we didn't get time to discuss your last talk, but you'll get a chance to explain it quickly: what are real options?

Real options are the right, but not the obligation, to take some action before an expiration date.

Very good. And is estimating in hours wrong?

Not necessarily. It's not evil.

How do you fail with a global team?

Very easily. If I want to make it fail, it's not hard at all.

So what would be the first step to fail with a global team?

Don't give them any autonomy, and basically make it almost impossible for them to deliver. It's very easy. You can shut off communication, or just give them piecemeal work. They'll fail almost every time.

What's the best way to fail a local team?

That's an interesting one.

Would it be the same?

Probably, but with different nuances, because it's more subtle. With a remote team, shutting off communication is so easy. And usually with a remote team, one team has more strength than the other, so that's an easy way to shut things down.

So if you shut down communication, the team will fail, but it's so much easier to shut it down when you're working globally.

Correct, yes. It takes work to make global teams work. You've got to put effort into it. You've got to really treat them as partners and an extension, and have ownership.
So you have to have ownership of the end goal across the board, and the local teams have to have their own ownership and autonomy in order to be effective and productive.

Thank you very much.

Awesome.