Jeff Patton here. Just to give a little bit of introduction about Jeff Patton: I'm sure most of you have heard about him, and saw him last night on stage. Jeff has been one of my gurus. He's been a very big influence on me. In fact, the most embarrassing moment for me was in 2007, sharing a stage with him while he was receiving the Pask Award. I keep haunting him about that. But Jeff is a wonderful guy, and I'm sure you'll enjoy his keynote. So over to you, Jeff.

Thank you. If you ever hear me say this, if you ever hear any consultant say this, first off, you know not to believe them. And the older I get, the longer I do this, the more I realize, well, the more I realize the sheer volume of what I don't know. This is a talk about, gosh, it's not squarely about agile. I'm not going to talk about agile. I'm not going to talk about Scrum. I'm not going to talk about any of this stuff. I'm going to talk about software development and, well, I'll start with how I've been fooling myself from the very beginning, from the very outset of my career in software development. We fool ourselves as a profession. And I want to talk about how I broke myself of the habit, and then, especially over the past 10 years, how I'm seeing organizations start to break themselves of the habit. Look, we spend a lot of time fooling ourselves. We spend a lot of time pretending we were successful. For as much as we use expressions like "fail fast," and part of the reason we're iterating is to learn, look, no one wants to fail. No one wants to be wrong. This is one of our biggest problems to fight, one of the biggest cultural issues I've got to fight. There's a video that's worth looking up, especially if you've got kids. Make your kids watch it. Watch it as a parent. It's this Derek Sivers video, Why You Need to Fail. All right, let's tear in. I need to talk about me.
And I want to rewind all the way back to, oh, it would have been about the early 90s, and my first job in software development. I remember being given the job of writing a piece of software for online aircraft parts ordering. This is for people who fix aircraft to be able to go online and place orders. This is in the early 90s, when they would have done this normally through call centers, and doing it online would have been kind of a big deal. And I remember building a piece of software I was really proud of. Well, I remember thinking the code was super good and I wasn't going to have any trouble with it. And I remember telling this guy, this is my friend Bill, who I've now known for, how long have I known Bill? Since the early 90s, so 25 years, something like that. Bill was in charge of testing, in charge of QA, at the company I worked for. And I can remember Bill saying, look, Jeff, I need to get your code. We need to start testing this code, because you've got to go live with this stuff. And I said, Bill, this code is so good, you're not going to find any trouble with it. So I'll get it to you when I get it to you. Look, I got Bill the code a week before delivery, so that should have been plenty of time. But Bill, in just a few minutes, starts breaking it, starts finding lots of problems. And that's my first instance of really realizing, my God, I can't believe that I believed my code was so good. I had thoroughly fooled myself. Over the course of the last 20-plus years, I've realized it takes a different kind of head, a different kind of thinking, to really look at and break software well, and that testing is a strong profession. This is a photograph from a recent team that I'm working with. We pulled this very small cross-functional team together for a workshop, and sitting next to each other are a tester and a developer. These people are working extremely tightly together.
They're working so fast that they don't have time to move tasks around on a board. In fact, the sticky notes stuck on the backs of their computers show what they're working on right now. When the developer on the right is done with something, he'll take the sticky from the back of his computer and move it to the computer of the person who's testing. And as she's testing things, he can lean over and look and see what she's doing, and she'll say, look at this. This is a lot faster than logging bugs. What I see is this rhythm of these people having a conversation about the code, and it's no longer about logging bugs. It's about her looking at the code from a different perspective and making his code better. I've seen, over the course of years, developers learn how to build a strong partnership with testers. So look, my lesson learned: testing well is a critical discipline. It's a skill you can work a lifetime to be good at. My friend Bill is still in charge of testing, in charge of QA, at the company I was at in '91. It's a lot bigger company now, and it's great. But look, it takes complementary skills to build software. It isn't just one good developer. Now, one of the things I could fall back on when I started my job was: I'm a white-hot UI designer. I can design a really great UI, and it's because I'm an art school dropout. Now look, if you're an art school dropout and you become a software developer, basically the UIs I designed sucked less than every other software developer's. So as a front-end developer, I was the man. I was great. Everybody said, wow, you've got to put your magic on this UI, it's going to be really fabulous. And I managed to fool myself into believing it really was fabulous. And everybody loved it, at least until we shipped. This is when I learned another really hard lesson.
Now, first off, has anybody here in the room ever sat with a customer or user as they used a product you've built or worked on before? See some hands. That's a lot of people, that's great. If you think back to the very first time you did that, how did that go? I'm hearing some words, and what I listen for is kind of the nervous angst building up. Did anybody have a great time doing that the first time? Anybody? See, a couple. Look, I found, almost all the time, especially when I started, that it never went the way I expected. The users didn't love this stuff as much as we loved it, as much as I loved it. And we were just wrong. You come out of the experience with one of two reactions. You'll either say, those stupid users, they don't know what they want. Or, I can do something about this, I can make this better. I learned very early on that UI design isn't about making things look good. There's a simple model that I draw on napkins a lot. So this is the model, drawn on a napkin, but let me explain it really quick. I think of UI, or user experience, as having three different layers. If I'm building software for someone and I find a problem, something that's difficult for them to do, a challenge, I identify a feature that will make things better, make it easier for them to do things. That's this raw utility layer: a feature that helps a user accomplish something they couldn't before. That's giving them raw utility, and that's going to make them happy. The next layer above that I'm going to refer to as usability. That's how easy it is to learn, how quick it is to use, how efficient it is to use, how well we remember it the next time we go to use it. If I can ratchet up the usability, that makes things better. And this last thing, that's the aesthetics. That's the way it looks, how consistent it is with my brand.
And because the way something looks inspires confidence, things that look good actually are easier to use and learn. There's an old expression: you can't judge a book by its cover. That's an expression because people judge books by their covers. And, well, people will judge the quality of the inside of the product based upon what they see on the outside. So this isn't just a simple thing. Making things look good turns out to be pretty important. Now, for this raw utility thing, here's an example: a website in the US called Craigslist. Is there an equivalent in India to this? It's a pretty bland, pretty boring website, not beautiful to look at and, by all accounts, not horribly easy to use, but simple enough. It squarely hits this utility thing. Now, every once in a while, well, it's hard to find products that get the usability and the aesthetics but don't get the utility right, because they hit the market and they die. If it doesn't solve a problem, how does that product survive? Anybody think of a product that doesn't have real utility? Microsoft Bob. Well, it died. Look, the poster child for products that don't have much utility is that: the Segway. That's a product that's been on the market for a long time, and nobody quite knows what to do with it. If I came up to you and said, look, I've got this idea for a thing that allows you to move around without walking, not very fast, and you can't carry stuff, and you'll look really stupid doing it, and it only costs about the equivalent of three or four thousand dollars US, would you jump right on that and say, well, that sounds terrific? They've been struggling and looking for uses for this. The product won't go away because there's enough investment behind it, but about the only uses we start to see are, well, mall cops and things like that. Look, those guys don't look scary to me.
They make me want to steal something, because they couldn't catch me, not in those things. Now, the last part, and the part that we often look at with UI, is this aesthetics part, and oftentimes you pull in UI people late just to make it look good, because we're not thinking about this raw utility stuff at the bottom. People have an expression for just making it look pretty: lipsticking the pig is the common expression for that. So, look, I learned very early on that a lot of the UI work I was doing was lipsticking the pig. We were making bad choices about what to build, and that was the real problem. It was getting that utility stuff right. So we decided, look, we need better requirements. If I'm going to get better at this stuff, if I'm going to get better at building software, well, users need to be better at telling us what they want. Anybody who's ever been in a talk I've given knows I redraw this same model over and over, and I think I'll keep beating on this model until the language is pretty common. In software development, you aren't here to build software, and that's one of our biggest false assumptions: that we're good at this stuff when we can build a lot of software fast. In fact, that's not the reason you're doing this. You're in software development to change the world. Now, hold on a second. Good. All right, for people who will be in the workshop I'll teach later this week, on Sunday and Monday, I do a lot of hand drawing, and my slides have gone so out of date it takes me a while to redo them. So, look, this is just the hand drawing of this. If your job is to change the world, and actually I'm still not getting it. Well, in a keynote talk, never change your slides while you're giving it. I'm also fooling myself right now that I was ready. This is what I wanted. Look, this model starts like this.
You start by looking at the world as it is now, and in particular you start by looking at the people that use your product, or the people you wish would use your product, or just people that you think you can help with the product. You'll find people that are unhappy or mad or confused, and it's looking at these people and the way that they work that gives you ideas. Now, those ideas could be for whole new products, they could be for features, they could be for enhancements. How many people have heard me give this same spiel before? Anybody heard this before? I'm glad this was news to a few, at least this discussion or this framing is new to a few people. Look, it's all these things that we eventually start referring to as our requirements. Now, I'm going to come back to this word, because if we need better requirements, one of the challenges here, one of the things we have to remember, is that requirements is just another word for the great ideas we believe will solve people's problems. Now, it doesn't matter what process we use, whether it's an agile process or a waterfall process. We go through some process, and in the end we build something. Something comes out, and it comes out into the future world, and what we go through is that those people that we looked at before, they get this product and they're happy. Because people are different, some people are less happy, and some people, there's no pleasing. Now, it's everything between that idea and the delivery that we fixate on a lot. That's the stuff, well, that's the output of this process. We know we're talking about output when we fret about time and schedule, and we know we're talking about output when we talk about velocity, when we fixate on how fast we're building things and increasing our velocity or having a stable velocity. But oddly, that's not what matters.
What matters is what happens when things come out, and the term I want to use for that, well, it's aptly named: it's outcome, because it's what happens when things come out. And we measure outcome, well, not in terms of how many features we ship. If you think of a product that you really love, some product you would tell somebody about, and think about what you would tell them about this product you like, I'm pretty confident that you won't tell them, I love it because it was on time. What we're looking for is what that product allows you to do differently than you did before. The way we measure outcome is in changes in behavior, like buying, adopting, and using your product. If you build a product that has a specific utility, that lets people do something, do they actually use it? When we're talking about requirements, oftentimes we use the term "capability to." But look, if we build a bunch of "capability to's" and ship them and no one actually does any of that stuff, that doesn't matter. I'm not interested in "capability to's"; we're interested in "actually do's," and we measure outcome in terms of, do they actually do those things? We get benefit, or they get benefit, when they actually use the software to do what they're doing. Now, when I drew the model before, it looked like it's all about people, but the model actually starts upstream. It starts with our business, and it starts by us looking inside of our organization, asking, where do we want to be as an organization? What is our strategy? And, well, maybe things aren't so good here, we need to improve, and that should cause us to focus on specific customers and users. One user may buy a product and use a product, and that's interesting, but if hundreds do, thousands do, hundreds of thousands do, that becomes really, really interesting, and it's the consequence of all that behavior change that really creates impact. Those are the three words I want people to understand.
We fixate a lot on output, but it's the outcome and impact that we're really shooting for, and in fact your job is not to build a lot more stuff faster. We've got this big assumption that everything we build will generate outcome and impact, and in fact it does, but not always positive. In fact, one of the big problems with software development is that there are always a lot more ideas than we ever have time and money to build. Your job in software development is to build less. It's to minimize that output at the same time we're maximizing that outcome and impact. I want to go back to this word requirement for a minute. I worked with an organization in the early 90s, the organization where Bill worked, and we built software primarily for retailers, mostly brick-and-mortar retailers, large chain stores, organizations that had at least 100 stores, into the thousands. I had lots of customers telling me they wanted lots of things, and I learned very early on, as a product manager for the organization, that if I did what any one customer said, others would be unhappy. I learned that those customers didn't have my business's interests at heart, and that we didn't have requirements so much as we were making decisions, the best decisions we could, about what to build.
As our company grew, when I started there, there were 30 or so people, we grew to a few hundred people and opened up a development office in India. As we were growing stronger, we got more traditional software people in, and I can remember a lady coming to me one day saying, Jeff, there are some things we need you to add to the product you're responsible for. And I said, great, no problem, tell me a little bit about what the features are and who they're for and how they help. And she looked at me and said, well, they're requirements. And I said, I get it, tell me what customers really need them, and who's using them, and how this helps them. And she looked at me like I was stupid and said, they're requirements. That's when I learned that this word means shut up. Talking about just the idea, just the feature, just the details, turns out to be a problem. It doesn't afford us the opportunity to step back and minimize output, to step back and really understand the problems we're solving. And look, this is a guy named Kent Beck. On my earliest agile project, Kent was the coach that was hired to help our company adopt Extreme Programming. He first described this idea of stories in this book called Extreme Programming Explained. Now, I mentioned yesterday, in answering a question somebody had asked about stories and user stories, that yeah, Kent originally called them stories, not user stories. And in defining what a story was, he said, look, software development has been steered wrong by the word "requirement," defined in the dictionary as something mandatory or obligatory. The word carries a connotation of absolutism and permanence, inhibitors to embracing change. And the word "requirement" is just plain wrong. What Kent was after, well, the behavior change that we were looking for with stories and with agile development, is, well, they're called stories because of the way we use them. We're supposed to be talking with each other, and as a reminder for what to talk about, this common template emerged for writing stories: as a particular user, I want this feature, this idea, this thing, this capability, so that, turning that frown upside down, so that I get this benefit. It's a simple forcing mechanism, a conversation starter, so that we start the conversation by talking about, well, what we're building and why. Now, you may know this about requirements, and I learned this early on: you can deliver a fraction of what's required and people will be thrilled, or all of what's required and people will be unhappy. Here's a simple table to kind of frame this. Look, if I'm focused on getting things done on time and on a positive outcome: if I get everything they asked for done on time and the outcome is good, people use it and people are happy, look, I'm great. If I'm not on time and the outcome is bad, well, I suck, and I think we've all seen that. But there are the other weird cases. If it's not on time but the outcome is good, I'm still good. I could be golden and still be late. And even weirder, if I'm on time and deliver everything they asked for and the outcome is bad, well, I suck, but they just won't tell me to my face. They just won't hire me again. So look, my lesson learned early on is that it does not matter what the requirements are: if, in the end, the outcome is bad, I lose. And look, I need to be better at focusing on that outcome. That's what we should be doing with agile kinds of thinking. Now, for me, then, in my growth, I said, look, if I'm going to better understand users and what they're doing, I need better research. Well, one of the pitfalls I fell into was to try and get data, and I learned very early on that it isn't data so much that matters. Now, I'm going to frame this with not just my stories but other people's stories. This is a very old picture. You can tell it's an old picture; for the younger kids, that's what monitors used to look like, and if you remember, that was a big one, it was really good. These people are pair programming. This is my friend
Andrew, and Silage, who I work closely with; they're on my team. This picture is a 2001 picture. That's one of the very first story maps that I built, on the wall back there, just a line of cards that tells a story, nothing to it. This is us, this is our team room in 2001. I've left this company long since, but I still know them, still work with them a little bit, and we built software for large brick-and-mortar retailers. Now, one of our customers in the US is a company called LL Bean. I don't know if people have heard of LL Bean before; they're a clothing manufacturer, nice premium clothing. We built software that they use in their brick-and-mortar retail stores, some of the stuff that back-ends what they're doing. But LL Bean builds really good products. In fact, posted on the wall there is a message from the founder: look, I don't consider a sale complete until goods are worn out and the customer is satisfied. Now, working with LL Bean, we'd gotten this complaint that the returns part of our software was difficult, it was challenging to use, and it took a while. We went through the data, and we found that, well, the number of returns they do is a fraction of the number of sales, so it's not very big, and the time it takes to process a return, well, it's only just a little bit more than the time it takes to process a sale, because we've got to gather extra information. Using data, we told LL Bean that this really isn't that big of a problem, and we're not going to prioritize this change very high with respect to other things. Now, they complained a little bit loudly, and their concerns are ours. This is where I learned very early on that I had a good CEO. We got on airplanes and we flew to where they were. This is where LL Bean processes returns, and during business hours this place is really, really crowded. What we learned is that LL Bean does not do returns at the regular checkout line with everybody else. They do returns all in one place.
They want to talk to you. They want to hear what's going on. They want to talk about the thing you're returning. And when we looked in the back room, these are bins full of things that they've tagged and bagged, things that people return, and there are a lot of them. At these big LL Bean stores, there's a ton. What's interesting is, that's Eric, my CEO, out there, and he's the one that talks directly to the customers, and we're there too. The lesson learned for us is that returns, from the data, may not seem like such a big deal, but when you're standing out there with them, working with them, and seeing these people that do nothing but returns all day long, little annoying things in the software pile up, and it really sucks. It's out of being there and seeing them work that we build empathy for the way they work. So the lesson for me, very early on: you don't get empathy from data. If you're a product owner, I promise you, you will prioritize a backlog differently if you've looked the person who has the problem in the face, and that's going to change things. Now, lean is a term that gets used a lot, and out of lean thinking come a lot of basic principles. One of the terms that gets used by lean thinkers is gemba, Japanese that translates loosely to the scene of the crime, or where the work happens. Gemba, when we're talking about figuring out process problems, is where developers work, where the teams work, where your workers are doing their jobs. But when we're talking about products, gemba is where your users work. Now, very early on, I remember in agile development the idea was to get customers as close to us as possible, but let me talk about why that doesn't work so well. Look, this is Jane Goodall, and I need you all to imagine you're Jane Goodall for a minute. For people who don't know, Jane Goodall is famous for studying primates, in particular apes, chimpanzees, things like that. And look, someone who is an agile person might say, look, Jane, this is horribly ineffective. You get on
airplanes, you go out to Africa, you spend weeks out there, you gather notes in bad conditions, and we've got an agile approach that will really speed things up for you. It's called chimpanzee-on-site: we'll bring the chimpanzees to you, and then if you have questions about their behavior, you can just turn and ask them, and things will go so much faster for you. You'll be horribly efficient. Now, first off, you probably know that if you're going to really understand chimpanzees, that's not a good way to do it. And if you're going to really understand the people you're building software for, it turns out that's not a good way to do it either. I look for excuses to use that old Woody Allen quote, that 90% of life is just showing up. When it comes to the way we work with our customers and users, it's just showing up. When I look back through pictures, I've got lots of old pictures of watching people work in back offices, in boring surroundings. This guy is a stock portfolio manager. Working for this company, this fairly large company that everybody would know the name of if I said it, I'm not going to say it, we're building this portfolio management system. We're talking with folks, and we say, look, we just want to get a feel for how the portfolio managers work, can we talk to them? And they said, they're very busy people, they make an awful lot of money, they're responsible for a lot of our revenue. We can let you talk to them, but we can set up an appointment maybe a week or two from now. Well, okay, where do they work? Could we just go down to where they're working and see where they work? And they said, well, they're down on the third floor, you know, it's a very secure environment, and you can't go down there because you'd disturb them. And I said, well, we just want to go down and take a few pictures so that we can understand what their work surroundings are like and the other tools they use. And they said, well, you definitely can't do that, because there are a lot of legal
ramifications and things like that, so no, you definitely can't. And we said, well, okay. They left. We went over to the elevator, we hit the third floor button, we went down to the third floor, and we watched from behind glass this big open trading room where a lot of portfolio managers were working. Somebody walks up and says, oh, you guys trying to get in? And we said, well, yeah. And they said, well, come on, and they lead us in. So we stand in the back and we're watching these guys work. We just snap a few pictures; no one seems bugged by that. This guy turns and says, what are you guys doing here? And we said, well, we're working on the new portfolio management system, we want to see how you guys work. And he says, come on over here, let me show you how this works, because this is really complicated. And he sits with us and he explains things. Other portfolio managers come, they talk to us, and we learn an awful lot in just a morning of watching them work and sitting alongside them. In the end he says, wow, this is really great, no one ever comes down here. And we show the pictures back to the organization, and they say, wow, this is really great, why doesn't anybody ever do this? These are a different kind of portfolio manager working at the same organization, and they work in teams of four. One of the items on the backlog, because they work in teams, was a way for them to chat with each other, using chat windows and things like that. But when you watch these guys work, they've arranged their desks so they could actually chat, really chat. They talk to each other all day long, and that feature seems pretty stupid in context. On this particular morning, the only other thing to note is that if you look at this window, it's really dark out there, because I got up at four freaking a.m. to get to their office in San Francisco, because they need to be up and running lots of algorithms to figure out their trading strategy before the market opens in the U.S.
So, look, to watch people work, you've got to be there when they're working, and it's not always during business hours, or at least during my business hours. On this particular day, the trading algorithms had failed. If we'd asked them how their process goes, and in fact we had, not once did they mention: some days the trading algorithm blows up, and we spend a lot of time trying to diagnose what went wrong and figure out a manual trading strategy. It's being there that lets you see things like this show up. I've got lots of pictures like this, including from last year, working with teams at Kodak. Kodak asked me to come out to Mumbai to work with some teams, and I immediately thought, oh my gosh, they've outsourced this work, and I can't make points to them about understanding and talking with their customers if we're dealing with outsourced development. Happily for Kodak, that wasn't the case at all. The guy on the right is a product manager, and the guy on the left designs printers, and we're talking with that guy in the middle. He's not a software developer. He takes pictures of people. For 30 rupees he will take your picture, or your family's picture, and then he prints it out. We spent the day in front of that hotel and at the monument that's kind of out in the water out there, and we watched these guys work. At the end of the day, these guys are rock stars. No one ever comes out and visits them, and these guys walk away with a strong understanding of what it's like to be a user of the printer they're trying to design a better version of. This guy is a person named Atik. He works at a U.S.
company called Edmunds.com. Edmunds is an automotive website. It's where I go to compare car information, get service records on used cars, and get other information on new cars. They help car buyers. But look, they hadn't really been working directly with their users, at least not back when I started working with them, and one of the first things that we did was to pull in a lot of people for them to talk to. The person sitting across from Atik is not a developer. She bought a car recently, and he's talking with her. It turns out Atik runs the analytics group inside of Edmunds, and at the end of the day, at our reflection, he said, look, I've always been confident I can tell you precisely what users do on our site, but it's not until today that I realized I could never tell you why. It's looking people in the face and asking them more questions that lets you figure out why they do what they do. It unravels a lot of puzzles. A lesson learned for me: data isn't empathy, and it takes the pairing of those two things to really make useful decisions. And this is why you hear so often in agile development that if you're going to do this well, you need a good product owner. They're the ones who will do this. They'll figure this stuff out, and then they'll tell you what the right thing to do is. I thought, as a product owner, that's what I'd need to do, that I could go out and figure this stuff out and tell people what they should build. But, well, it doesn't actually work that way either. This is a model I'll often draw to talk about where the sweet spot is, the kind of product we're looking for. This model comes from a guy named Marty Cagan. Marty Cagan, among other things, started at eBay as its third product manager, when eBay was a startup, a very small company, and when he left eBay in the late 2000s he was in charge of 60 product managers and all of user experience, and led product there. If you were a product manager at eBay, he would tell you your job is to find a product that is at this intersection
of what's valuable to your organization to us, to eBay, to our organization valuable to customers and users usable by them and feasible to build in the time and tools we've got look any idiot can come up with fabulous feature ideas that people can't figure out how to use well they need to work with people that can figure out how to make them usable and look any idiot can come up with feature ideas that are way too expensive to build given the time and tools we've got to build car or home I promise you I can come up with lots of things I can't afford challenge is finding that sweet spot the thing that we can't afford and as a consequence if you're a good product manager doing this stuff well you focus primarily on value but you find people that understand users and understand how to talk with them and work with them and that ends up being user experience people or in some organizations that aren't building commercial products and you want people you want to work closely with someone who's an engineer someone who spots feasibility concerns or constraints soon someone that can tell you that's going to be expensive but that's a great idea that will be cost effective to build it's this core team or balanced team or look if I walk into companies like I'll walk into companies like Atlassian who make Jira and Confluence and when I walk in to an area where a team works and they'll say this is where the triad sits the triad is this team of three I'll walk into a couple months ago was with Spotify they refer to this group as the trio and it's this group of three well it sort of busts up that scrum single product owner thing and yeah there is a person who's primarily responsible but if they're working on this stuff alone if they're building backlogs alone if they're doing this work alone it just does not work this is a smarty Kagan guy let's see if my sound works here there's a lot of short little video clips on YouTube where he's describing a lot of these concepts but he 
describes one of the secrets to why this is kind of valuable now even though officially the product owner is responsible for functionality the designer of usability and the engineer on feasibility the little secret in software teams is that truth is usually the best innovations actually come from the lead developer and the reason for that is that the lead developer typically knows what's possible better than anyone else so while officially you all have those responsibility really what's going on is all three of you are trying to identify that minimum viable product one last point in terms of location it's at all humanly possible you want to keep those three co-located right in the same location so we need to find video on YouTube for him but look the lessons learned are for years throughout the 2000s I kept running into product teams I kept running into scrum teams that were really effective that seemed like they were really doing well and I'll ask them well how's this scrum thing going and they say well we're doing scrum but we're probably not doing it right because the product owner really works closely with a lot of other people and he made all the stories himself in fact he works closely with other people and they figured out together and we know it's not scrum right we're trying to do it right but it works too well doing it this other way look it's over time that I learned that product owners lead a cross-functional team and they don't just work alone either they bring the whole team in they help them take ownership and everyone takes ownership and they're really inventing and innovating so look I moving forward I would say look if fine we'll figure this out now if we get the right people and we really work to understand our users and we work together we'll get this right we'll build software that achieves strong outcomes we'll be successful and look in talking with Marty just nonchalantly as we were driving somewhere working together with a client he said 
Well, if you're really good at this stuff, you'll be right about a third of the time. Now, I kind of freaked out, because when he says you'll be right about a third of the time, I think: I'm wrong two-thirds of the time. He says, yeah, that's kind of the way it works. This stuff is hard; you don't get it right, and that's if you're good. Most people are right maybe 20% of the time, if they're pretty good, and everybody's right sometimes. I said, well, give me a quote that I can use. He said: "Usually 50 to 80% of all the software we ship fails to accomplish its objectives." What he means by that: it doesn't mean the feature didn't go into the product, and it doesn't mean that everybody hated it. It means that for a lot of what we build, just not much happens. People might use it a little bit; they don't think the feature is great or fabulous, they just think it's okay. And what do you do? You don't pull it out. But it sure did not deliver the business value or the benefit you expected.

Now, for years in software development, and especially in agile development, I've been seeing people cite this Standish Group report, the CHAOS report, and it delivers these haunting facts: after surveying lots of projects, most of the features we build fall into the category of rarely or never used. And a lot of people in an agile context, I've seen, say: look, an agile process will solve this. But look, it doesn't. I've been at this a long time; I'm able to fool myself into believing those features will be used, and I'm pretty confident that on all those other projects that failed, everybody who put those features in didn't think they would be rarely or never used. A lot of features fall into the category of "it really seemed like a good idea at the time." Maybe you're old enough to remember Clippy.

Look, you can take an opportunity, and we know that all products aren't equal. If I can think of a product to build, say a portable music player, well, back in the 2000s people did, and look, Apple builds something that's great. And Microsoft says: look, there's real demand for a product like this; we can build something that's great too. And a lot of people really believed in that product. This guy believed it well enough that he put a tattoo of it on his arm. But look, the product did not do so well; it was dismal, and it was losing a lot of money. And some people were super convinced this product would do well; that's a level of commitment you don't see from most product people.

Now, I mentioned Atik at Edmunds.com. This is Eugene; he's the director of product management at Edmunds. I heard Marty say that you're right only a third of the time, and I went back. Look, I've been working with Edmunds.com for a long time; they've been using this kind of approach for a while, and they've been getting good at measuring outcomes, good at figuring out whether the things they build work or don't work. And I asked him: look, is what Marty's saying true? Are you right so little? And he said, we'd be great if we were right a third of the time; we're right about two in ten times. You can tell he's not a developer; otherwise he would have reduced that fraction to one in five. But what he said is: what we've done over time is build a more flexible architecture. That's because we're wrong so often, and we have to put things in and pull things out, and we have to very quickly measure whether they're good or bad. A side effect of focusing on this stuff is becoming really good at reversing our decisions, really good at validating whether things are good.

The lesson learned here: what we're doing is hard, and we're usually wrong. Instead of planning to maximize velocity, we need to make better plans to learn.

This is another team, at a U.S. company called Snagajob. You wouldn't have heard of them, but their focus, their target market, is hourly-wage employees. One of their bigger customers is Burger King in the U.S.
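That Edmunds habit of "measuring outcomes, figuring out whether the things they build work or don't work" usually comes down to simple statistics. As a concrete illustration (this is not from the talk, and all the numbers are hypothetical), here's a minimal sketch of the kind of check such a team might run: did a shipped feature actually move a conversion metric, or is the change just noise?

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic for the difference between two conversion rates.

    A is the baseline cohort, B is the cohort that saw the new feature.
    A positive value means B converted better than A.
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled rate under the null hypothesis that A and B convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers: 10,000 visits per cohort, 480 vs. 520 conversions.
z = two_proportion_z(conv_a=480, n_a=10_000, conv_b=520, n_b=10_000)
print(f"z = {z:.2f}, significant at 95%: {abs(z) > 1.96}")
# → z = 1.30, significant at 95%: False
```

With these made-up numbers the apparent lift isn't statistically significant, which is exactly the "nothing happens" result Jeff describes teams seeing most of the time: the feature shipped, the metric wiggled, and the honest conclusion is that you can't claim it worked.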
So they help hire tens of thousands of Burger King employees, and employees at Walmart and other big retailers, fast-food restaurants, things like that. This is a team at a daily stand-up, and they're a little depressed. When you look at their board, it's got the traditional columns for the stuff we want to build, and some steps for further analysis, what would be a backlog-grooming kind of thing, and everything moves across the board until a last step where it's ready for release. But things do not come off the board when they're ready for release. They tag things with those pink stickies to say they've been released, and because they're a dot-com, they can release and measure fairly quickly. They leave things on the board, and next to them are the specific kinds of things they're trying to influence. RPV is revenue per visit; they're watching revenue per visit, hoping it goes up, but in fact right now, as a consequence of that feature, it's starting to go down. Next to it on the card are specific smaller metrics that tell whether people are using the feature they're putting in. They've got this explicit measurement step, and every day their daily stand-up meeting is about not just what we did yesterday and what we're doing today, but what we shipped, and is it working or not. Nothing leaves that board until they have a discussion about what they've learned. This is an agile team that's not focused on output; this is an agile team that's focused on outcome.

And one of the annoying things is, well, what they're learning: crap, this stuff fails a lot. We put stuff in, high quality, and nothing happens; things don't go up; our bets, our gambles, are often wrong. This is the guy who leads engineering right now, Thomas Friedel. This is a very disappointing story: this is a team working on a major site revamp, and everything they keep doing isn't working. The site redesign they're working on is failing, and they stewed on this forever. Look, in theory the project was done months ago, but it's failing from an outcome perspective, so they keep pushing things into it to try to influence it. Talking with them, he says: yeah, we eventually scrapped that, we killed it. We found, after spending all that time and all that money, that it wasn't going to work. But what really came out of it was a strong architecture that helped us release quicker and learn faster.

So look, I might have had the assumption going into this that great architecture is all about scalability and performance, and what I'm learning is that that's not typically it. This is Thomas. Every organization I work with that's doing well has a re-architecting, re-platforming project going on; nobody ever seemed to plan on the growth that they've got. And they all have fun naming conventions. They had an original architecture, which was a little messy, and somebody had a great idea for a better architecture, and they called this new project to re-architect "Eagle." Eagle trundled along and then started to fail, and they came up with another architecture strategy called "Phoenix." You know what a phoenix is: a bird that is reborn in flames. Phoenix would replace Eagle, and now they've got the old architecture, plus some parts of Eagle, and some parts of Phoenix on top of that. Then Phoenix goes sideways; that's not working. So they come up with another architecture that they call "Tucson." Now, if you look at a map of Arizona, Tucson is to the right of Phoenix, so it's kind of going sideways from that. And Tucson, well, eventually failed. But out of this whole focus on quickly releasing and testing, they came up with, and I'm glad they got a new naming paradigm, a new architecture called "Somersault," and Somersault is quickly displacing things because they have a real need for it. You know, they always had pretty good scalability and performance, but boy, it's focusing on the need to learn faster that made a difference.

This is a guy named Bill Scott. Anybody ever heard of Bill Scott before?
Any engineers heard of him? So, a couple of hands here. Bill goes way back in a lot of areas, but built a lot of fame working inside of Netflix, and Netflix is fabulous at building things and measuring how well they work. He recently, well, "recently" is a couple of years ago now, started with PayPal, and PayPal's got some really horrendous architecture. He makes the point that, look, engineers traditionally start by focusing on scalability and performance and reuse of all their components, and look, we spend a lot of time investing in designing reusable stuff when we haven't even proved yet that anybody wants to use it in the first place. Architect first for use, before reuse. And he makes strong points about building architectures that make it really fast for us to prototype and validate things. He's got a book coming out on lean architecture.

Let's go back. This is Eugene again, at Edmunds, and when I talked to him about how this journey has changed them, he said: we used to have a content management system that did A/B testing; we used a product called Test&Target to do a lot of this. But we found that it was too big, too clumsy, too bumpy for this, and we found ourselves rolling our own tools to test and measure, and over time we just weren't using Test&Target anymore. A notable poster child for architecting their own ways of measuring is a company like Etsy. If you're an engineer, look at Etsy's engineering blog to see how they release quickly, how they do one-button releases and testing. I see the same story: in organizations that really focus on learning fast, a different architecture comes out of it. The lesson learned for me: engineer first for experimentation, then focus on scalability and performance. Prove that people want to use it before you focus on the reusability.

So look, all right: we can't ever be sure we're right, and we can start to architect so we build things faster. So all we need to do is build, measure, and learn, and we can use this lean startup mantra, where our focus isn't on velocity; our focus in lean startup is on learning velocity, how fast we learn. There are no story points in lean startup. In fact, one of the challenges that makes it hard to explain is: how do you measure how much you've got done? If, in a lean startup situation, you try 20 experiments and they all fail, that's good. You learned; you invalidated 20 bad ideas that you shouldn't have built to begin with; you've minimized output.

Let me put a couple of stories in here. A guy who works with Bill Scott at PayPal is Cody, and Cody told me a long, involved story about PayPal working on a feature to let you do Facebook sign-on for PayPal. Now, everybody here has used PayPal, right? Does everybody use that Facebook sign-in feature for PayPal?
No, because there isn't one. It was a stupid idea, it turns out. They built the whole thing. They built it in prototypes, they had people come in, and they proved that people could easily use it, and people said, we really like this feature. So they built it, and they launched it in a limited area, and what they found out is that people could use it, but suddenly, when they were actually paying for stuff with their own credit card, the last thing they wanted Facebook to know was anything about their credit cards or financial situation. They were horrified by the idea of signing in with Facebook. So, after spending many months and building lots of stuff, they realized: this is a stupid feature; we should have done something else to really validate that people would use it before we over-invested in it.

This is a guy named Bill Buxton; he's notable in the user experience community. He has an expression that I like: when you look at prototypes, forget the difference between high fidelity and low fidelity; there's only right fidelity and wrong fidelity, and the right fidelity is the fidelity we need to build to learn what we need to learn. Now, I'll advocate building very simple paper prototypes, then other things that really look real, but at some point in time you need to transcend to things that really work like they're real. With things that look real, you can put them in front of users; they can step through them, you can see if they're usable, and you can ask users, would you use this? Cody at PayPal did that. But what we really need to get good at measuring is: did they use it? That's what things like A/B testing are for, and that's what other types of experimentation are for. More and more, I'm seeing organizations figure out how to build and release skeletal features that just barely work, that absolutely do not scale, built in weird ways just to get by, just to test with a subset of users. It's moving to really validating whether our stuff really does work. Experimenting effectively starts taking time and money, and it takes the whole organization's participation to get behind it.

Let me tell one last story, and let's see if I can pull this together. I like this story; it's about Edmunds.com. Edmunds launched a feature called Price Promise. It turns out, and I'm sure it's true here, that one of the things people really hate about buying cars is negotiating with car salesmen, or negotiating, full stop. I realize I'm in India, and I'm looking at Naresh, and I've seen him negotiate, and it doesn't seem like he hates it. But I hate it. So they launched this feature called Price Promise, where you can get a lowest price online, and you get the promise from the dealership that when you walk in, they honor exactly that price; they will not upsell you on anything. You pay that price, you give them the money, you shake hands, and you leave. That's it.

They've got this idea, and you can ask consumers: do you like it? They say, I'd love that. They can ask dealers: would you give us rock-bottom prices that we can publish online for your cars? They say, well, yeah, we would, especially if people came in and bought at them. And because they were serious about measuring, they made partnerships with dealers in the Portland, Oregon area of the US. They had dealers send prices to them in spreadsheets; they manually inserted pricing and information into their back-end database using SQL statements. They built just enough of the feature so that it would work in this Portland, Oregon area, and they ran it for a few months, actually seeing whether people would use the feature and actually buy things. And you know, when I talked to Eugene, he said: look, in the past we would have argued and argued for months about this, and then we would have either gone all in and spent millions building this feature, or we would have said, no, we're not going to do it. Now, after getting good at this, we're willing to pay to learn; we'll spend a few hundred thousand dollars and a few months of development time and learn. It's a pithy
expression that a friend of mine, David Hussman, uses: the difference between learning and failure is how much money you spend to do it. Look, I can spend a few days of time, chuck it, and say that was learning. But it's hard to spend months of team time and pretend that was learning; that looks like failure. What I see with organizations is them gradually moving that bar up. When it comes to prototypes, we progressively scale fidelity, and we focus on how fast we learn before we focus on how we scale.

In the end, I keep hoping that I'm really going to find a process that works, and when I go back and ask, I'll use a couple of examples here. Look, Eugene says: we had the misperception that the process we follow, the methods, would produce success. And yeah, sometimes, but most of the time it doesn't; we've got to just pay attention. And what Thomas says is: look, we're looking for a process that helps us not fool ourselves.

So that's where I'm going to end. I'm finding that from the beginning I've built a career around thinking I'm good at development, thinking I'm good at UI design, thinking I'm good at making product decisions. It's 20-plus years later, and I'm starting to realize that that's a stupid strategy, and that organizationally we're finally evolving to processes that help us learn faster, that help us not fool ourselves. And it is 10:29, and I've left no time for questions, or even time for Naresh to wrap up much. So that's it. Do we have time for a couple of questions? Sure, man, you're in charge. Does that leave anybody anything they can ask a question about? Do you want to talk about books? I want to thank people, and we want to shut it down. Thank you very much for listening to all this.

So, in case you're interested in getting a personally autographed book from Jeff, just go out there; there's a counter over there with about 20 copies of these books. You can buy them at a much cheaper price than what you would get in the US, or than the list price Jeff would get in the US. And it's still a lot more than I'd pay to buy them here in India. So it's a good opportunity to get a personally autographed book; go grab them. Thank you, Jeff. Thank you very much.