Awesome. I'm going to get started here, I guess a minute early, but hopefully you all are here for my presentation on estimation. Right away I want to give credit to one of my coworkers who couldn't be here today, who you're probably all here to see: Johnny Fox. He's not here, but he has a kid on the way, so he couldn't make it. A little bit about me. I'm John Nolan. I'm the director of operations at Promet Source, a Drupal development company out of Chicago. I spent a lot of time trying to pick out which picture I wanted to show you, and the one of me in the water describes me best, because I'm from America, from Chicago, I like boating, and I'm drinking beer. So that picture summarizes everything about me. My history at Promet and other jobs has been as an analyst, a consultant, a project manager, and now director of operations. I oversee all of our project managers and all of their development projects. I work a lot on our company's processes and tools, and, with the help of my coworker Johnny, estimation is one of those processes. I wanted to start off with a few details about the presentation, just so you know what you're in for. This is a quote from Steve McConnell, who I'll quote several times throughout this presentation, and I'll credit him at the end as well for his book, which I've based a lot of this on and grabbed some data from: the typical software organization is not struggling to improve its estimates from 10% to 5% error, but instead trying to prevent those projects that go 100% over. I don't know if you have that experience in your own organizations, but I think most of us aren't trying to find a way to have a 5% margin of error; we're trying to avoid the chaotic, out-of-control projects that run 1,000-plus hours over. So this presentation is about the fundamentals that are often overlooked and that lead to some of those overages.
We're going to talk about big risk factors. Really, it's a presentation about giving you the tools to talk about this internally. Maybe you have problems selling your estimate to your salespeople or to your client. I'm definitely going to focus more on the internal piece of it. I'll probably throw in some sales points throughout, but it's really about preventing those huge overages. I'm not going to give any magical formulas. I do have a formula in the presentation, actually, but it's not a magical one. We're not going to talk about fine-tuning. I'm not going to tell you exactly how many hours it should take on average to build a content type; we're not going to that level of detail. I want to start by talking about the big problems that I see, and that I think most of us see, in the estimation process. Then I'm going to move into some of the lessons learned in avoiding that risk, and if we have time, I'm going to do a very, very high-level pass over some of the estimating techniques and which ones I've found to be better over the years. So what is the problem with software estimation? The first point I want to talk about is the discrepancy between what different people think "estimate" means. When a developer says "estimate," or someone in sales says "estimate," or your client says "estimate," or your executive says "estimate," I'm guessing that most of the time everybody is talking about different things. I think this is probably the number one issue you may experience within your organization when talking about your estimates: estimates, targets, and commitments are entirely different terms and different concepts, and normally people in different roles are talking about different ones. The first one is what we ask developers to do, or maybe the project manager, and that's the estimate. That's a prediction of how long something's going to take.
How long is it going to take, not what it's going to cost or what you want it to cost. Normally your executive is going to be talking about a target. I need to have this done in three months. The client doesn't have more than $30,000. Those are targets. They're not estimates. And I'm guessing most of you have experienced that communication gap where you go to somebody and say this is going to take five months, and they say are you kidding me? It needs to take three. Your estimate needs to be three. It's not your estimate that changes. It's the target that needs to change. And the last one is probably something that's more around your contract or your sales folks. That's the commitment. So three entirely different things. Normally you're going to look at an estimate, and I think I said the estimate's going to be five months. And somebody from your team comes back and says, no, that's not going to do. We need it in three months. And eventually hopefully what happens is you find a commitment that is in between. You say, well, here's what I can do in three months, or here's what I can do in a thousand hours. And they say, oh, okay, that's not all of it. Can you also add this piece in? And you normally find the middle ground between what the target timeline was, what the estimated timeline or hours was, and you cut something out of that phase. And this is a really hard thing for us to do. We talk about agile. We talk about a lot of different ways to approach this, but this is very key. And hopefully this arms you with the next time maybe you're the estimator, and you go to your boss and say, this is five months, and he says, no, the estimate's three months. You can politely correct that they're talking about a target, something that they want. And you can talk about what you can commit to. The next thing that I've learned along the way is that your best developer normally isn't your best estimator. At Promet, we have Solutions Architects. 
They're basically our technical team leads. And there are a lot of really good things about your unicorn or your hero or your architect, whatever you call them. They know the most about Drupal. They probably have the most vast experience. They've done big migration projects, integration projects. They can do front end. They can do back end. They understand business value. They understand the concepts of minimum viable product or agile. They can probably debug anything in about three minutes. That's really great. But they're normally missing a couple of things, and being a good developer doesn't necessarily mean that you're great at estimating. You might leave out things like, oh, I'm not the one doing the work; that normally makes a huge difference. They're normally not going to account for time spent training or onboarding somebody, or debugging an issue that comes up. There are a lot of miscellaneous things. And there's a lot of what I call unfounded optimism. A piece of unfounded optimism is something like: we'll be more efficient and effective on this project than the last one that was the same scope. A lot of things went wrong last project, but they won't this time. We started the project slowly during the learning curve, but now the velocity of our sprints is going to be much faster. Maybe you're asked to estimate something in the middle of a project. Those unfounded optimisms or subjective opinions are going to skew your results. Normally no one is actually good at estimating something that's entirely new. Your best estimator is going to be somebody who has access to the most data, the most historical information about what that thing costs. We're going to do a little activity here about the definition of a good estimate. I think the definition of a good estimate is that 90% of the time it's right. I'm just going to make that argument right now. And what I want you all to do quickly in your head is think of the temperature of the sun.
Give it in Fahrenheit and think of a range, whether you jot it down right now or just think of it in your head, I'll give you a few seconds to think about it. So it's a range, maybe it's from 1 to 100 degrees, I'm telling you that answer's not going to be right. Think of a range and think of being 90% confident that you have it right. Think of your range. Now I want you guys to be honest, I've never done this so I hope it works. Did this number fall in your range? If yes, can you raise your hand? If no, can you raise your hand? So my math tells me that if we all were trying to be 90% confident, 90% of the people in the room would have that answer correct, right? So I'm not the first person to do this. They've done it over many, many different industries, projects, different types of questions. Maybe they're asking developers about the temperature of the sun or maybe they're asking a building contractor about questions. They've done it in lots and lots of different industries. And what they've done is they've asked them 10 questions and said to be 90% confident. And if you think about it there too, if you're 90% confident on 10 questions, you should get 9 answers right. Turns out that they get nowhere near 90% of them right. In fact, the average number of correct answers was 2.8 and only 2% of quiz takers scored an 8 or more correctly. What that means is that most people's idea of a good estimate, an idea of 90% confidence, really means 30%. That's why we triple all of our development estimates. So the next thing is, it kind of sounds like a broad statement. Software is not developed in a vacuum. Basically, I think in estimation we often just think that we have a path right in front of us and nothing's going to get in the way. Not just client factors, but maybe outside factors. And there's a lot of external factors that we don't account for in our estimates. 
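Stepping back to the confidence quiz for a second, the numbers are worth working through. If people aiming for 90% confidence only capture the true value 2.8 times out of 10, their real calibration is about 28%. A minimal sketch of that arithmetic, where the "widening factor" correction at the end is purely my illustration, not a formula from the talk:

```python
# Back-of-the-envelope check on the calibration quiz quoted in the talk:
# estimators aiming for 90% confidence averaged only 2.8 correct out of 10.

stated_confidence = 0.90     # coverage estimators *think* their ranges have
hits, questions = 2.8, 10    # average quiz result reported in the talk

actual_calibration = hits / questions
print(f"Stated confidence:  {stated_confidence:.0%}")   # 90%
print(f"Actual calibration: {actual_calibration:.0%}")  # 28%

# One crude correction: widen single-point estimates by the ratio of
# stated to actual calibration. This factor is illustrative only.
widening_factor = stated_confidence / actual_calibration
estimate_hours = 40
print(f"Widened 40h estimate: {estimate_hours * widening_factor:.0f}h")
```

Note that the ratio works out to roughly 3.2, which lines up with the speaker's quip about tripling development estimates.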
Thinking about the beginning of a project: the contract finally gets signed, you're all excited, and the staff's not ready. Now somebody changed their mind about something. The project that you just moved the staff over from now has a big bug, and they go back to it. How many times have you layered two projects right back to back and had to totally skimp on one because you keep going back to run support efforts on the other? Even as a company that has a support organization, that's still something you probably experience. Later on in the project: staff diverted to a trade show. All of your projects are experiencing this right now; I bet it wasn't in the estimate. I told you I was going to make some comments about sales, and I'm guessing there are a couple of items on here that you look at and say, you know, that's not the client's fault, or how do I price that? I'm not necessarily saying that these items need to be passed on to the client. It's not the client's fault that your staff is not ready. It's not the client's fault that you decided to go to a trade show and delayed the project. But again, we're not talking about selling. We're talking about estimating. We're talking about planning. And these things need to be planned for in those timelines, whether you get paid for them or not. The next point is that events that happen during the project nearly always invalidate assumptions. This isn't rocket science. You have functional assumptions, staffing assumptions, priorities, and inputs. These are probably the four areas I would recommend looking at at the beginning of a project and asking: what are my assumptions here, and what's the probability that these aren't going to go as planned? In our estimating, we have a spreadsheet that we use, and we always write down our assumptions next to every single line item.
And that gets carried all the way through contracting and is reviewed at the beginning of the project. It's really key to have those assumptions written down and understand how to plan for them. Functional assumptions: your third-party data store wasn't what you thought it was. Staffing: your architect wasn't available because another project sold first. Priorities: going back and supporting a project that was just completed and had a major issue. And inputs, whether from the client, your design team or design partner, a server, or all sorts of other things that need to be contributed to your project. Plans without those buffers aren't plans; they're targets that aren't going to be hit. Software estimation is not just about adding up the sum of its parts. I thought a lot about how to talk about this, and I started thinking about economics a little bit. I think everybody knows what economies of scale are: if you have one person running your HR, you probably still have one person whether you're a five-person shop or a 50-person shop, so the cost of that person per capita goes down. There's this weird thing that I think happens in projects that I would call diseconomies of scale: the bigger a project gets, or when a little thing goes wrong, it can greatly amplify in the opposite direction. So let's think of, for instance, a project that's gone off target. It's off its timeline. The migration was way more difficult than we thought it would be. We're two weeks behind and a couple hundred hours in the hole. We need to start thinking about the project dynamics. If we just added up the parts, we wouldn't be accounting for how a project changes over time, especially when difficulty or delays are introduced. And this statement is really talking about that amplification.
There are certain things that happen in a chaotic project; the project dynamics of your team change when you're delayed. You have all these things start happening. You have more status meetings. You spend more time apologizing to the client. You have to go back to the contract. You have to re-estimate. All these things start amplifying the amount of time that you spend on a project. And that's when you see projects that are just a week or two behind and all of a sudden, at the end, you're asking: how did we get a month behind? Or two months behind? That amplification happens, and that's why I say it's not always just adding up the sum of the parts. So those are what I think are some big global misconceptions or problems that we're having with estimation and that we need to plan for. I wanted to share some very quick metrics around the results of those, and I apologize if this is too small for those in the back. The chart on the left shows the timelines of projects, from a couple of different organizations providing feedback on their projects. You'll see everything is above the pink line, which is the line where it took as many weeks as you had planned. A couple of them made it on time, but the vast majority were over time and none were under. None of them were delivered ahead of schedule. On the right, you see the facts that I think most of us know: 20% of projects are on time and on budget, 50% are either over time or over budget, and 30% fail. This next chart is interesting; I took it from Steve McConnell as well. It depicts that the cost of underestimation is exponentially worse than the cost of overestimation. And I don't mean to imply that if I overestimate, I'm not going to have any costs.
But I think many of you probably know Parkinson's Law: the idea that if a task is going to take eight hours and you give a developer 16, they'll find a way to fill all 16. There's maybe some truth to that, but that's a predictable, linear cost: some downtime, some loss to the whole system, some hours the client pays for. Whereas underestimation, like I said, goes up exponentially very quickly. And this is the main point I'm getting at: as a software industry, we don't have a neutral estimation problem. A neutral estimation problem would be: I've got 10 projects I overestimated, 10 projects I underestimated, 10 projects in the middle, and it all averages out. The data is absurdly clear that we have a very, very big underestimation problem. And again, I don't necessarily mean that we have a selling opportunity here to double all of our estimates and make more money. I'm talking about the estimate, not the sale price, and the planning for how much time it takes. We are drastically underestimating that time over and over and over again. The even more surprising thing to me: a survey asked executives whether they would rather have a project on a shorter timeline with a lot of risk, or one delivered on a longer timeline with very little risk, and eight out of ten of them took the longer timeline. In thinking about that, I believe one of the reasons is that we don't present the longer timeline in the first place. Whether we know it or not, we're kind of scared to present it. We feel like if we present that, they're just going to walk away. We give in before we even try to sell; we hand over all of our bargaining chips right away. And I hope that's just because we aren't armed with the conversational pieces to talk about the target versus the estimate versus the commitment.
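Circling back to that cost chart for a moment: the asymmetry can be sketched numerically. The idea is that overestimation wastes time roughly linearly (Parkinson's Law filling the slack), while underestimation compounds through replanning, extra meetings, and rework. The linear and quadratic shapes below are assumptions I've chosen purely to mimic the shape of the chart, not constants fitted to real project data:

```python
# Illustrative cost curves for over- vs. underestimation. The linear and
# quadratic penalty shapes are invented to mimic the chart described in
# the talk, not measured from real projects.

def misestimation_penalty(estimate: float, actual: float) -> float:
    """Extra hours lost relative to a perfectly estimated project."""
    if estimate >= actual:
        # Overestimation: Parkinson's Law fills the slack roughly linearly.
        return estimate - actual
    # Underestimation: replanning, status meetings, and rework amplify
    # the shortfall, so the penalty grows faster than linearly.
    shortfall = actual - estimate
    return shortfall + 0.05 * shortfall ** 2

actual = 1000  # hours the work truly takes
for estimate in (1200, 1000, 800):
    print(estimate, round(misestimation_penalty(estimate, actual)))
```

With these made-up numbers, padding by 200 hours costs you 200, but cutting by the same 200 hours costs 2,200: the same shortfall, amplified tenfold on the underestimation side.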
And hopefully you can take some of the ideas we talked about here today and go say: hey, this is honestly what it's gonna take. I know we don't see it now. I know you don't see the fact that so-and-so might quit, or that I'm gonna go to DrupalCon, or any of those things, but we need to start having that conversation, talking about the difference between our estimate and our target, and then finding that middle ground. Because often they're willing to meet you halfway. So, moving into avoiding risk: these are some lessons learned, or thoughts I've had, on estimating and on ways to avoid some of the things we're encountering. This chart isn't meant to be to scale, as if all four of these major risk areas are each 25% of the risk; it's just a depiction of what else I wanna talk about. We just talked a lot about the estimation process, the way we go about doing it without thinking about the items I just covered. I think that's a big piece. Then there's project information. I know it's a migration, and it has an integration with a third-party calendaring application, and it's got content. You wanna go a lot deeper, obviously, to eliminate risk. Project chaos: I'm gonna talk a little more about the amplification of a project that's spiraling out of control and how to hopefully get out of that. Not get out of the project, but out of the spiral. And probably one of the bigger ones as well is omitted activities: things we don't think of that take up our time. Consider your project's current position within the cone of uncertainty to help you understand the level of accuracy your estimate has. I don't know how many of you are familiar with the cone of uncertainty. Basically, on the left-hand side you have day one of your project, and all the way on the right you have the day you deliver it. And this line is the best case: at the very beginning, you say your project is $100,000.
On day one, it could very well be 400,000, or it could be 25,000. As you get further along in the process, you should have a narrower and narrower 90%-confidence range of how long that project should take. So: I want a website, $100,000. It could be 25, it could be half a million, who knows. That's maybe just the email that comes in saying, I want a website. But then you get to product definition. I have a Drupal-to-Drupal migration, I have a third-party calendar, I have a blog, I have news articles. You start getting a scope of services. At requirements complete, this is where I'd say you have the specs. Scope would be: I have a Drupal-to-Drupal migration. Specs would be: here are all my content types, we've noticed it's a one-to-one mapping, and we've probably documented the mapping. That's a little bit more of a spec, or requirements complete. You can see you slowly move down the funnel, or the cone. User interface design complete: we found that this is so incredibly vital. I can't imagine estimation without wireframes at this point. We used to do it all the time, and you still have to, right? You have an RFP that comes in and you respond to it, and you obviously haven't completed the wires yet. But if you do have the opportunity to phase your projects, and phase the contracting of how you do it, we've found that's a huge step in the right direction. As soon as we're able to have a wireframed site, it prevents a lot of the uncertainty. And then all the way to design complete: full documentation, designs, everything. That's basically when you're ready to hand it off to your development team. You'll see you're still not all the way there, but the further along you get in a project, the closer you hopefully feel towards certainty in your estimates.
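For reference, the variability ranges McConnell publishes for the cone can be tabulated against the $100,000 example. I'm quoting these multipliers from memory of Software Estimation, so treat them as approximate rather than authoritative:

```python
# Approximate Cone of Uncertainty multipliers, per Steve McConnell's
# "Software Estimation" (quoted from memory; verify against the book).
CONE = [
    ("Initial concept",             0.25, 4.00),
    ("Approved product definition", 0.50, 2.00),
    ("Requirements complete",       0.67, 1.50),
    ("UI design complete",          0.80, 1.25),
    ("Detailed design complete",    0.90, 1.10),
]

nominal = 100_000  # the "$100,000 website" from the example above
for phase, low, high in CONE:
    print(f"{phase:28s} ${nominal * low:>9,.0f} - ${nominal * high:>9,.0f}")
```

The first row reproduces the talk's day-one spread: the $100,000 concept could land anywhere from $25,000 to $400,000.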
So, depending on what phase you're in when you need to write your estimate or propose your price tag to a client, think about what kind of variability you probably have in that estimate. This is one of my favorite things that we do at Promet, and it helps eliminate a ton of risk: leveraging the power of well-defined products to move your way through the cone of uncertainty. What I mean by that is, if you can get an early sale that's significantly smaller than the full contract, you're gonna be able to eliminate a lot of the risk and do a lot of discovery. One of the things that I've been a part of is training workshops. Going in, talking with a client, maybe doing Drupal concept training. Fantastic idea: go in and educate your client on the terminology and concepts of Drupal. All of a sudden you can start architecting the project with them, and they understand what you mean when you say taxonomy and content type and view and templates and all the different things you need to talk about. We found it extremely valuable to do Drupal concept training. And once they have the concepts down, you can start breaking down their website: so these are press releases, let's talk about what fields are required. And that language makes sense to them. It's a way to get paid to educate them and, in turn, get down to more spec details of your project. Another thing that we do in our support division is audits. We have clients that currently have Drupal sites; they come to us for support and maintenance, and we audit their site. It's a small package, and it allows us to go in, see what skeletons may be in that closet, and educate ourselves on the site. It allows us to say: oh, you just want to make the profile pictures on your users' pages bigger? Well, it turns out your user profile isn't even built in Drupal.
They may not know that, and you might have just given them an eight-hour estimate for something that's actually going to require rebuilding their entire website. And that happens a lot. I'm sure many of you have experienced some crazy sites once you do get your hands into them. The last one is running a discovery. Some people call this a sprint zero: fully speccing out a project. Normally you're gonna do some type of storyboarding or sketchboards, or primitive wireframes, or workflow, and on the technical discovery side, discovering any migration path or content types. It normally bootstraps off of the training workshops; we actually try to sell those as one package. So these are really good ideas, and I'm guessing most of you do these. It's often difficult to say: I'm gonna sell you a couple-thousand-dollar discovery or audit, and I'm making no promises on the cost of your project. Normally you have to give them some type of range, but at least this can be a checkpoint in the road. We talked a little bit about chaos projects earlier. I maybe didn't say chaos, but project dynamics and external factors. I think of the projects where I've fallen into the biggest hole, and I would definitely define them as chaotic projects. Normally at the end, I look back, and we're doing a retrospective of why this happened: how did we go from the migration taking three weeks longer than expected to being three months late? That's when the bad goes terrible. And I always look back and wish I'd called a really solid timeout and fixed some of the root problems, instead of just trying to keep the momentum going. So we have those external factors, we have the project dynamics changing, and all of a sudden the CEO has been notified and now you're asked to do a re-estimation and tell them when it's gonna be done now. That's when all hell breaks loose and you mix the M&M's and the Skittles.
But I think we start spiraling here, and then we do the same thing. More external factors end up happening. The developers start getting frustrated with the project. Productivity and efficiency go down. It falls further into the hole. You start having more and more status meetings, internally and with the client. You start going back to the contract. You start taking shortcuts on your code quality, and it just keeps on going. Then it's further behind, and your CEO is asking for another re-estimation. Everybody wants to keep the velocity of their project moving. But if I think back to some of the hundreds-of-hours holes I've gotten into in the past due to mistakes, I would have gladly taken three eight-hour days of a developer sitting on their hands rather than that 300-hour loss at the end of the project. I don't have a magic solution, but taking that timeout and trying to identify what the real problem is, is extremely important. You're normally gonna go back and try to re-stabilize something, or you'll look at the different assumptions you originally made, figure out what's changed, and reorganize yourself. I'd much rather do that than just throw another developer on and say we'll catch back up. This next one can sometimes be a sore subject internally within an organization. Sometimes people forget to add certain activities to an estimate, and that just happens. But I'm sure all of you have been in a position where you did include all those activities, and certain things get slashed out of your estimate before it goes out for contracting to the client. And what I think is that there's a big difference, again, between what the estimate is and what you're actually selling, your price tag. So you should make sure to have that conversation with your sales team, or whoever it is that may come in and slash and cut down your estimate.
Let them know that that estimate is still the estimate. It's still how long it's going to take me. And just because you discounted 100 hours doesn't mean it's going to take 100 hours less. I just listed out a lot of activities here that I've seen cut or omitted from projects that really do end up taking a lot of time. For those architects out there, I know there's a lot of activities that our organization has forgotten in the past. We think that our architects or our technical team leads are part of the development team and they're going to be writing code or doing site building. And in turn what they're really actually doing is they're reviewing everybody's code. They're doing the pushes. They're on the phone with the project manager and the client. They're providing updates. They're removing blockers. There's all these things that they do that are just completely omitted from our estimates. There's non-software things in here. Management items, stand-up meetings. You have a team of six that meets for 30 minutes, five days a week. That ends up being a lot of hours over five months. Again, I don't think these items are necessarily items that you always have to sell in your contract. You might not put a price tag to them and your client pays for them, but they should be in your estimate so that you can project timeline and hours that you're gonna need for staffing and everything. My last comment just kind of loops back to the 90% confidence. I would never suggest going in and cutting a development item's estimate down as we proved earlier. Normally, those estimates are already too optimistic and it would be great if I was wrong. One of my last items here is not over-complicating your estimate. And I have these two graphs here and I don't really know how to best describe this, but they pooled a group of teams, a bunch of different development teams, and asked them to estimate the same thing. 
And for one team, they gave them many factors, many options for how to estimate it. I'm gonna use a probably kind of dumb example, but imagine if you could identify 10 different tiers of content types: highly complex, medium complex, some range like that, or maybe based on what types of fields you have. Somehow you're able to say that a content type could be anywhere from one hour to 52 hours, and you have 10 different options there. And you ask somebody to estimate a website that way. That gives people many control knobs for choosing which tier each item falls into. As opposed to, on the graph on the right, they just said one content type takes, on average, this long, covering all different types. So there are fewer control knobs. I probably did a terrible job of explaining that, but basically the idea is this: the more factors you put into your estimate, the more options for how you estimate it or tweak that number, the more touch points you introduce. And since we humans don't understand 90% confidence, more touch points mean more places to insert uncertainty, optimism, and biases. This study found that with fewer control knobs, fewer factors given to people to estimate something, their ranges were way narrower and more accurate to the end result. I mentioned that I wanted to close by talking a little bit about estimation techniques before we take questions. This is gonna be relatively high level. I assume most folks here have some process they use for actually doing the numeric estimates, so this is just a little thought-provoking material about how you might tweak your own model. Everybody's got their own spreadsheet or model or something that they use; maybe this challenges you to find areas where you can improve your template.
So we're immediately gonna assume that everybody in this room is an expert estimator and that all of those problems we just talked about are no longer problems. And we're gonna guess how many people are in the room. The numbers I put here aren't actually real, so this is not specifically this room; it's a hypothetical room. There are a couple of different approaches we could take. I could call on you, and each of you would probably have a different way, but I think I'd be able to sum people up into a few categories. The first one is the hobbyist. This guy guesses how many people are in a room as a hobby. Maybe it's an executive, maybe it's me, maybe it's a project manager. This is somebody who walks into the room and just says: ah, that looks like a $75,000 project. Hobbyist. The next guy is the factor guy. They do some form of counting: 15 rows, 10 people in a row, some type of quick computation, and they get a number. Then we have the percentages guy: the capacity of this room is some number, and I think it's 50% full. And then we have the data guy: the last three sessions in this room had this many people, and I'm guessing if I average them, it's gonna be about the same again. Going on historical data. So I think about this as four different types of estimators: there's judgment, there's counting, there's computing, and then there's calibrating. Calibrating is basically taking historical data and adjusting it based on the new task at hand. In my opinion, counting and calibrating are by far more accurate than computing and judgment. You find something you can count, and then you calibrate it with a historical piece of data. I count 10 content types, and on average it takes me 12 hours to do a content type. That is a count-and-calibrate method. The next one is compute. Compute is kind of based on percentages.
So instead of counting how many meetings I know my project manager's gonna have on this project, and every five-minute interaction he's gonna have with the developer, and every report they need to fill out, I'm guessing what most of you guys do is assign a percentage of the total project to project management. That would be a form of computing. And then judgment. Judgment is normally where you place your risk factors. Maybe it's just at the beginning of the project, you're like, it's a $75,000 project. But normally this is where you're picking your level of risk, your multipliers: I'm not very confident in this line item, so I'm gonna assign a factor to it to multiply it either up or down accordingly. So the saying is: count if at all possible; compute when you can't count; and use judgment alone only as a last resort. We'll talk more about judgment in a little bit. And maybe to clarify it a little more, I wrote down some of the things that I think we count and calibrate on a regular basis, versus things that we compute, and items that we judge. Sorry if this is small; I will post these slides later. So I wanted to walk through just a couple of quick examples. And again, hopefully this challenges you guys to think about how you update your estimating spreadsheets to eliminate as much judgment as you can. This is what I talked about before with counting and calibrating. I have eight content types at four hours each. This is not real. Don't go back to your developers and say a content type takes four hours, or I'll get lots of hate mail. That number is wrong. But I know how to do eight times four, so that's why I put it there. So count something. It's gotta be something that is meaningful to the size of the project. Back in the day, it would be lines of code. Obviously with Drupal, you wouldn't count lines of code. Count other factors, like I mentioned.
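As a hedged sketch of the two approaches described above, count-and-calibrate plus a percentage-based compute step, the following uses entirely made-up item names, counts, and calibration numbers for illustration:

```python
# Count and calibrate: count sizeable items in the spec, then multiply
# each count by a historical average pulled from your time-tracking data.
historical_avg_hours = {"content_type": 4, "view": 6}  # made-up calibration data
counts = {"content_type": 8, "view": 5}                # what you counted in the spec

dev_hours = sum(counts[item] * historical_avg_hours[item] for item in counts)

# Compute: apply a percentage factor for work that's hard to count,
# e.g. project management as a share of development hours. Ideally the
# percentage itself is calibrated from past projects, not guessed.
pm_percentage = 0.15
total_hours = dev_hours * (1 + pm_percentage)

print(dev_hours)              # 8*4 + 5*6 = 62
print(round(total_hours, 1))  # 71.3
```

The point of the structure is that only `pm_percentage` involves judgment; everything else is a count or a historical average you can audit later.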
And then the calibrate part is really interesting, and I challenge you guys, if you don't have this already, get yourself a good ticketing system. Organize your time tracking in such a way that you can get historical data around the things that you count, so that you can calibrate those numbers. We use JIRA at Promet, and Tempo Timesheets, which is attached to JIRA, to do that. And we assign our estimates and our time sheets all to common line items so that we can go back, evaluate the numbers, and find out what the averages are. An example of computation, I just talked about this: assuming 1,200 development hours were estimated, however you estimated those, you apply some percentage factor to that. That too can be a calibration if you are pulling it off of historical data for how much project management or QA or whatever other items you would apply a percentage to. If any of you figure out a way to count all the different PM activities and estimate them with a count and calibrate method, I'd be thoroughly impressed. I just think there are so many of them that it's a little bit harder to do, and that's why we normally do a computation. And my last item here is judgment. Up here I made up a risk multiplier. This one is site building definition, so this is where the judgment comes in. I'm thinking about this project ABC and I'm just about ready to put down the estimate. Do I have really good definition of the project's site building specs? If it's very low, a 2.0 means I'm basically gonna double what my estimate would be. Maybe it's extremely high: I know every single field, every single field type, everything is written out, and I'm literally gonna hand that over to somebody and not have to talk to them till they're done. Then I might reduce what my normal or nominal estimate is. And then influence is its magnitude of effect on the project.
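A hedged sketch of how judgment multipliers like these might be applied to an estimate subtotal. The factor names and values here are illustrative inventions, not calibrated recommendations:

```python
subtotal_hours = 400  # the count/compute estimate before any judgment

# Made-up risk multipliers: above 1.0 inflates the estimate when
# definition is poor or a risky area looms large; below 1.0 can
# reduce it when the spec is extremely well defined.
risk_multipliers = {
    "site_build_definition": 1.1,  # spec is decent but not field-level
    "migration": 1.3,              # migrations tend to magnify quickly
}

estimate = subtotal_hours
for factor, multiplier in risk_multipliers.items():
    estimate *= multiplier

print(round(estimate))  # 400 * 1.1 * 1.3 = 572
```

Because multipliers compound, even modest factors move the number quickly, which is one reason to derive them from your own historical data rather than gut feel.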
So where site build definition might be 1.1, maybe migration has a huge influence on your projects, because you know that influence is gonna magnify very quickly. And I didn't just make this up myself. This is based on this other factorial chart. It's, I hope I'm saying it right, COCOMO II. Has anybody heard of this before? It's 17 effort multipliers and five scaling factors. It's relatively old, I would say, but it is something that you could take a look at. Its factors may not apply to the projects that you do, so I challenge everybody: if you are gonna use risk multipliers in your estimates, maybe go pull your own historical data and start making these factors and multipliers for yourself. And normally we would apply those at the end, kind of to the subtotals. But judgment is the difficult thing, and we always have to do it. Like I said, I think RFPs are a great place where you see this commonly, because you haven't had an opportunity to have a product like a discovery or an audit or something to start off and clear up the scope of services. It may just be one page, or maybe it's 300 pages, but it doesn't have that spec-level detail that's gonna help you estimate accurately. I have two formulas here that aren't magical, but they may help you get better at estimating. The first is the magnitude of relative error: actual minus estimated, over actual. This is really cool to do at the end of your projects, or maybe take your top 10 projects and do this and see what your average is. So this is an example here of 10 relatively large projects, the MRE on each of them, and then the averaged MRE. And what this is telling me is that for these 10 large projects the estimation was off by 34%. Now, you could just keep on estimating the way you're doing, add 34% at the very end, and then your MRE will be zero in no time.
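A hedged sketch of both formulas mentioned. I'm using the absolute value in the MRE, and I'm assuming the expected-case formula referenced is the standard PERT three-point estimate, (best + 4 × most likely + worst) / 6, which lines up with the numbers given in the talk. The project data below is made up:

```python
def average_mre(projects):
    """Mean magnitude of relative error over (estimated, actual) pairs."""
    return sum(abs(actual - est) / actual for est, actual in projects) / len(projects)

def expected_case(best, most_likely, worst):
    """Assumed PERT three-point estimate: (best + 4*likely + worst) / 6."""
    return (best + 4 * most_likely + worst) / 6

# Made-up (estimated, actual) hours for a few projects:
print(average_mre([(500, 700), (800, 1000), (300, 450)]))

# With the best / most-likely / worst figures from the talk:
print(expected_case(500, 720, 1000))  # 730.0
```

With best 500, most likely 720, and worst 1,000, the PERT formula gives 730, in the same ballpark as the figure quoted, which is why I'm assuming this is the formula meant.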
But hopefully you can do this and maybe refresh it on a regular basis as you change your estimation approach and your process, and it'll help you see if you're truly getting better or not. The second formula is expected case. When I read about this I thought it was really cool, and we actually did it on one of our projects. We had one team that I would say normally has the lowest estimates. They always look at it as if nothing's gonna go wrong. Everybody knows that different teams take different approaches to their estimates, and normally you modify their estimates accordingly. So I took their number as the best case, and then I asked another team who I knew was gonna include everything. They're gonna give me the estimate that has all the factors, all the extra time; they're very realistic in their estimates. That was the worst case. And then I took another one that was in the middle. I actually did this, and it came out something like: the best case was 500, the worst case was 1,000, and the most likely case was like 720. And I ran the formula and I think it ended up coming out to like 733. We actually just finished the project and it took 742. I was absolutely amazed. I don't think it'll be that close every time, but it was actually a pretty interesting exercise, something that I think I'm gonna play around with a little bit more and see how it goes, whether it helps kind of scale or give you a little higher level of confidence in what your range is. So that's kind of it. I do wanna give credit to this book. This book is amazing, and if you haven't read it, go read it. You'll see that I pulled many things in my presentation from it. It's been an extremely educational thing for me to read. I've read several of Steve McConnell's books on project management and other areas of work that he's done. The first half of the book is kind of some of the stuff that we talked about today.
The second half of his book goes into a lot more detail on estimation techniques, and it gets very technical on how to get crazy detailed in your estimates. The second half of that book is really about, if you are an organization that has gotten your MRE down to 10%, how you get it down to 5% or 3%. I didn't cover that because we're not there yet. So thank you, and I'll take any questions if you guys have anything. There's a microphone so that it gets recorded, I think. Just to put some of this in perspective, I'm guessing what you're gonna say is that you do your projects fixed bid? No, not all of them. So, time and materials? We do a couple different types. I mean, we have packaged services, which is a form of fixed bid. We do time and materials projects as well. And then we do fixed budgets, so more of an agile sell. But again, I focus on this stuff more as planning tools, so it doesn't necessarily affect how you sell something. Having the number in mind of exactly what it's gonna take means you can plan how many team members you need to put on it and when they're gonna be on it. I think having a good estimate is more valuable internally for planning than it necessarily is for selling. All right, thanks. Hi, my team has heard me say this a lot. I always talk about the definition of 'it' and the definition of done. So I always talk about, like, okay, when you say it's gonna be done and you're gonna do authentication, did you just do the API or did you do the front end? What was the 'it'? You said 'it' was done. And the definition of done is all about, was it just you coding? Did you merge the code? Did you QA it? Did you release it? I don't know if you have any comments on how that plays into estimation. Yeah, I think that falls under a handful of the omitted activities. We always think about how much it takes just to build it, and there's always a workflow and deployment process or code review process.
Those different things that you infuse into your projects to ensure your quality and actually get it to where it needs to go. I think those are some commonly omitted activities that, when you ask a developer to do an estimate, they're not necessarily thinking of. Right, yeah. I'm assuming you have worked with function points before; some people hate it, some people love it. What's your experience? Function points? I haven't. You haven't? What are function points? It's when you categorize everything that you have to do; for instance, a page will take five function points. Oh, like story points? Story points, function points, yeah. Yeah, so we use story points. We don't necessarily run all of our projects by-the-book agile. We more use kind of the ceremonies of agile to facilitate a project. The goal is to increase communication, which I think is why most people go agile. They still realize they have to deliver something to the client in a certain amount of time. For story points, we've actually taken a relatively, probably maybe naive or simplistic approach, where we assign story points as if a point is an hour of time. And we look at the velocity that a project starts moving at to adjust it as it moves forward. Right, thank you. In relation to the chaos project stuff, are you a ghost, and were you in our office last week? No, but I want to hear more. Yeah, moving on. Do you have a set price for the workshops and the audits? So when you propose those, do you just have a set price and say this is how much those cost? Our audit is very productized, and I use the term productized very loosely. We have like two or three different sizes, and those have very specific pricing. Training can be a little bit more flexible. We do have two different sizes that we typically look at. However, training workshops also depend on how many people and how many sessions you're doing. So sometimes we have three people in a room.
Other times we have 15, and that plays a role in it. And then discovery: we actually have a template that we've put together and been trying to modify and improve that helps you price a discovery based on some formulas. So it's like a calculator. What items are you gonna be doing in the discovery? Are you creating personas or not? Are you doing wireframes or not? Do you have a migration? Is it Drupal to Drupal? Then it's gonna be this. Is it not Drupal to Drupal? Then it's gonna be more. Integrations: how many integrations? We have a formula template to calculate our discovery pricing. Are you doing your user stories during the discovery? Yes. Okay, thanks. Is that it? Thank you guys very much for coming.