All right. Thank you. Where do I start? Yeah, I'm sorry. For a lot of people that have seen me speak before, I will replay some common themes that I've said before, and a couple of new things. For people who've seen me speak before, some of the old stuff is gonna be like a band playing "Free Bird" or something like that. I'll play some of the old things that I'm used to saying, and I want to talk about product ownership in agile development, or product management, as it used to be known in the rest of the world. Now, I think agile development has been and is currently screwing up product management. And this is the design track, and, well, I wanna make sure I pull in themes so that you can understand where product thinking falls into design thinking. But I'm curious how many people I'm really talking to here. How many people in the room have been or are a product owner or product manager? Can you actually, well, stand up? Everybody deserves to see who you are; you can't see whose hands are raised otherwise. Okay, it's quite a bit more than I thought. So look, almost half, that's great. So the rest of you, you can get a drink early, maybe. But no, I'm lying. Look, I saw Steve Denning speak today, and he started with, well, by the time I got in there, one of the slides he put up said that agile is eating the world, that 80% of organizations are trying to be agile while a lot of them are failing at it, and he's right, it is. I was lucky enough to get a very early start with an agile process. I started with extreme programming in 2000. The term agile was coined in 2001; that's when I learned I was practicing an agile process. But look, I will tell you, like I've told people a lot of times before, agile has bugged me from the very beginning. There were some things that were missing in agile development. But don't get me wrong, agile development has made things quite a bit better. I wouldn't still be standing here talking about agile development if there weren't things about it
I really love. But ever since the very beginning, there were a lot of things that were left out of agile development, and I've been working hard to try and put them back. Now, in a lot of organizations they have come back; I see really solid design thinking and product thinking worked into an agile process. But one of the weird things that's happened is that in common agile practice, it's still not there. When I say common agile practice, I mean the general, off-the-shelf way of working agile that people learn, and all those things are still left out of it. I've been working hard, and a lot of people here today have been working hard, to try and talk about how those things work in agile development. But there's a tide of people teaching this broken common agile practice. And what I want to do in this talk is sort of set the record straight. I want you to understand how common agile practice, as we teach it today, is messing things up. I believe there's a bit of urgency in doing this. I was at a conference called Mind the Product, and I took this picture a couple of years ago. Mind the Product is a conference for product professionals: for product managers, for UX people, for people that act as product owners in software companies. These conferences are held in London and San Francisco. I'll do one next month in some city in Germany that I've forgotten already, but there's one in Germany. This conference was held at the Barbican Centre in London. There were between 1,400 and 1,500 product people in the room, big talks, lots of big names there, and there was one spontaneous applause line in the middle of a talk from this guy. And what he said was: I hate agile development. It's all points and velocity, and no one ever talks about the customer. And in this room, 1,400-plus people applauded. So the question for you to think about is: why is it that product people hate agile development so much today?
Especially after agile development has been around for 17-plus years and we've had a lot of time to work through this stuff. Let's talk about what we do to start fixing it and making things better. These things are gonna seem a little bit counter-intuitive. So point one here is: stop focusing on delivering software faster. Now, I need to draw the drawing that I always draw at the head of every talk, and I will keep drawing the same thing until the language is so common that we actually stop talking about points and velocity. When I look at agile development, it always starts with ideas. And those could be ideas for products or features or enhancements. These are the things that in traditional software development are referred to as our requirements. And these are the things that in agile development we take and turn into this backlog thing. Now, in traditional software development, we've got to turn all these things into something we ship. We've got to turn features or enhancements into the stuff that customers will get. And we're worried about how long it's gonna take. We're worried about time. And in traditional software development, we call this stuff scope; in agile, it's our backlog. We're worried about cost. And cost in software development is, oh, it's team time. It's how many people, or how long it's gonna take, to turn this scope into working software. We talk a lot about this iron triangle thing: time, cost, and scope. The idea here is you can fix any two of these. If I fix time and cost, then scope is going to have to fluctuate. If I fix time and scope, we're gonna have to add more team members; cost is gonna have to fluctuate. And if we fix all three, then things like quality leak out. You can't call it a triple constraint when there are four things, so you never know where to put quality here. If you're working in software development, you're worried about time, cost, and scope. Well, I think we all are; that's a big thing.
Agile development tries to fix a bit of this problem. Instead of focusing on how long that whole product or whole feature takes, we now work in these short two-week sprints. And in these short two-week sprints, we worry about exactly what that guy was talking about: we worry about points and velocity. And that's not what matters. If you are responsible for a product, your job starts upstream. We've been in a design track all day long talking about how we need to start by paying attention to the people using the product we have. And when we do that, we find people that are unhappy. We find people that are worse than unhappy, that are angry or just confused and struggling. And it's the work you do as a product person to pay attention to those people and come up with better ideas. Now, that work isn't in agile development; there, it's all points and velocity. Now, if you do identify real problems to solve, and identify things that actually solve those problems, what we hope is true is that when we ship those products or features or enhancements, those people are happy. The problem, especially when you ship products to consumers, is that people are not all the same. Some people are less happy than other people, and some people there's just no pleasing. Now, here are a couple of words I wanna add. Everything between the idea and the delivery, that's output. That's what we pay attention to every single sprint. At the end of every sprint, we show what we built. We talk about the quality of it. But what matters is what happens when things come out. The word for that is outcome. And we don't measure outcome in terms of how much we've delivered, or even its quality. We measure outcome in terms of how these people change their behavior. What we hope happens is that these people see what we've built, they try what we've built, they use what we've built, they keep using what we've built, and ideally, they say good things.
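That chain of behaviors can be sketched as a simple adoption funnel. This is just an illustration of measuring outcome as behavior change, not anything from the talk's slides; the stage names and numbers are invented.

```python
# A minimal sketch of outcome as behavior change: given counts of people at
# each stage (saw it, tried it, used it, kept using it), compute how many
# converted from one stage to the next. All names and numbers are examples.

def funnel_rates(counts):
    """Given ordered stage counts, return step-to-step conversion rates."""
    rates = {}
    stages = list(counts.items())
    for (prev_name, prev), (name, curr) in zip(stages, stages[1:]):
        rates[f"{prev_name}->{name}"] = curr / prev if prev else 0.0
    return rates

# Hypothetical numbers for one shipped feature.
counts = {"saw": 1000, "tried": 400, "used": 240, "kept_using": 60}
print(funnel_rates(counts))
# A big drop at any step tells you where the outcome is failing,
# no matter how much output the sprints produced.
```

The point of a sketch like this is that none of these numbers appear in a sprint burndown; they only exist if you go looking for them after release.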
Because if we ship something and they don't do these things, that's not good. In fact, that's where the value starts to come from. Now, if you're a product manager, you're responsible for understanding these problems and you're responsible for that outcome, not just points and velocity and on-time delivery. And it gets even worse than this. If you're a product manager, the problem actually starts upstream. It starts inside your organization. And your organization has a completely different set of responsibilities. Your organization needs to sustain itself. Look, it needs to make enough money to pay you and other employees. And if it's publicly held, it oftentimes makes commitments to grow. And every company seems to want to grow. And your company is gonna pay attention to metrics, and at minimum it's gonna pay attention to metrics like revenue and cost. And if those metrics are flat, that makes people inside the building unhappy. Now, they are not unhappy because the software sucks or it's hard to do their jobs. They're unhappy because the organization isn't growing. It's just sustaining itself. It's not meeting its objectives. And hopefully these people give some guidance on which customers and users to focus on. And when we go all the way through this, if we ship something, what we hope is that those people see it and try it and use it. And if a few do, that's awesome. But if lots and lots do, that's when we start to feel it inside the walls of our company. That's when those metrics go up. That's when people inside the organization are happy. I'm gonna use a different word to talk about this longer-term outcome, this longer-term benefit: impact. And impact is stuff that matters to your organization, stuff you measure like return on investment or market share or brand awareness or brand sentiment. Look, your organization needs these things to sustain itself.
But your users need something that solves their problem in order for it to make sense for them to buy it and use it. The tough thing for you, if you're a product person, is you are sandwiched in between these people that need to sustain their organization and these users that have concerns and complaints, and your job is to make everybody happy. You thought your triple constraint was time, cost, and scope; those are three constraints, but the things you need to be worried about are output, outcome, and impact. You are responsible for getting things built, because if nothing gets built, if there's no output, then there is no outcome or impact. Look, in agile development, how many of you are using things like stories in a backlog? Just checking. One of the things we talk about is stories and acceptance criteria. And I'll give people a quick recipe for coming up with acceptance criteria. It's just this: when we're about to build something, ask the question, what will we check to confirm this is done? Because at the end of the sprint, I wanna know this thing is done. And I also wanna demonstrate this stuff. So, what will we demonstrate? It's good to talk about those things and agree on them before we start building. But super important to talk about is: how will we measure outcomes when this thing is put into use? If we talk about value being important, but we don't ever talk about how we're actually going to recognize or measure it, that starts to be a problem. We plan to measure outcome. For a product manager, that's what they're accountable for. And that's what these guys at Mind the Product are kinda wincing about. It's not stories and velocity and points, it's that. Look, if you ship a product and you can't measure if it was used, that's not good. Don't mistake velocity for value. Now, look, I wanna come back to this theme here.
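The story-kickoff recipe from a moment ago (what will we check, what will we demonstrate, how will we measure outcome in use) can be sketched as a simple checklist structure. The field names mirror the three questions; the example story and its contents are invented for illustration.

```python
# A sketch of the three-question recipe as a data structure, so the third
# question (outcome measures) can't silently be skipped. All example
# content below is hypothetical.
from dataclasses import dataclass, field

@dataclass
class StoryAgreement:
    story: str
    check_to_confirm_done: list = field(default_factory=list)   # what will we check?
    demonstrate_at_review: list = field(default_factory=list)   # what will we demonstrate?
    outcome_measures: list = field(default_factory=list)        # how will we measure outcome in use?

agreement = StoryAgreement(
    story="Saved search alerts",
    check_to_confirm_done=["alert fires within 5 minutes of a matching listing"],
    demonstrate_at_review=["create a search, save it, trigger an alert"],
    outcome_measures=["% of active users who save a search in their first week"],
)

# A story with an empty outcome_measures list is the smell the talk warns
# about: we agreed how to check it's done, but not how we'll recognize value.
assert agreement.outcome_measures, "no plan to measure outcome"
```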
I need to talk about another problem that's embedded in agile development that really colors the way we do things. We treat product owners and product managers and, well, our business like customers. And that might not sound bad, but let me see if I can explain to you why it is bad. First, I wanna describe what I'm gonna call a client-vendor pattern. We use this all the time. We use this whenever we need to buy a service, whenever we need to hire a company to do something for us. In this pattern, one person plays the role of the client and somebody else plays the role of the vendor. Now, if you're the client in this model, it's your job to know what you need. Recently, our car was hit by somebody and we needed to repair it. My wife played the role of the client in this, and she had to go to multiple places to get estimates on how much it was gonna cost to repair the car. She had to explain to the person at the auto body shop what she wanted. It was their job to listen. It was their job to understand. It was their job to figure out how, and how long it was gonna take, because they needed to give back an estimate. Now, in the software world, sometimes we call these requirements: the customer gives their requirements, and the person in the vendor role figures out how, so they can give back an estimate. Now, what happened with my wife is the auto body shop gave her a big estimate. She hated the estimate. She complained; she said this is a lot more than I thought it was gonna be, and they tried to justify why it was gonna be that much, and she left without a lot of confidence. And I talked to her and said, go get another estimate, go someplace else. She went someplace else and they gave her a better estimate. She liked that estimate better and she went with them. Now, in the client-vendor pattern, pushing for a better price, a lower estimate, is good; it serves us well.
I'm in India now, and my gut is that if I went to buy something in a marketplace in India and I did not push back on the price, the person in the vendor role would consider me a fool. It works well. We wanna push back on that price because if I do that, I get a better deal. I save more money, and the person who's the vendor, it's their job to watch their own profitability, to make a profit. Now, if we do agree on price, you know the way the pattern works: this person places an order, and then these people do the work and find out how right their estimate was. They need to make sure that they focus on that scope, on what they agreed to. They need to focus on time, they need to pay attention to their cost, and they need to pay attention to quality, and then they ultimately deliver. Now, again, this is a pattern we use all the time. For better or worse, this is the pattern that is hardened into the way that most organizations work with IT. We call these people the business and we call these people IT, and the business falls into this behavior of pushing IT for a lower estimate or a better price, because that feels good. When they get a better estimate, it feels good, because in real life it means we get a better value. But where this is a serious anti-pattern is when we both work for the same company, and whatever you thought you were saving actually came out of your own pocket. If you push these people to deliver under cost, well, that hurts even worse. That's when the quality starts to leak out, as they're trying to deliver something they can't. Now, for better or worse, this same pattern is hardened into agile development. In agile development, we call this person the product owner and these people the team, and it's the product owner's job to know what they want.
It's their job to create a backlog and prioritize it and describe what to do next, and it's the team's job to estimate and deliver. In this pattern, the team, like the vendor, is responsible for time, cost, and scope. And the product owner, for better or worse, is the one responsible for outcome and impact. They're responsible for those things, and when I say outcome and impact, that's what we've been talking about as value. In this pattern, it's not the team that's responsible for value, it's the product owner. Now, I mentioned I started with this process called extreme programming in 2000, and when I started with this process, I worked at a startup in San Francisco and my business card said product manager, but in XP, the role for this was called the customer, and I was referred to as the customer. One of the things that freaked me out — I'd been in software development for a decade before that — was that they were asking me to work in this weird way. They were asking me to talk to the team and tell them what I wanted, and I didn't always know what I wanted. I wasn't used to working this way. I was used to working in kind of a different way. I'm gonna draw a quick model that comes from a guy named Marty Cagan. Marty Cagan has written a book on product management — right now, I'd say the best-selling book on product management — called Inspired. He's been in a lot of technology companies, but he spent a lot of years at eBay. He started at eBay as the third product manager hired there, and he was with eBay up until about 2008 and helped eBay grow to the size that it is now; he was responsible for product management and user experience there. Now, he said, look, if you're a product manager — first, he considers the job of product manager exactly equivalent to the job of product owner. It's the same job; he would just say product owner is the agile word we use.
You've got to remember that agile development sprang out of traditional IT, where we're oftentimes building things for our own internal use, and we don't oftentimes think about or call those things products. If I work with any bank, for instance, and I say the word product, they're gonna tell me about checking accounts and different savings accounts. Those are a bank's products, but the website, the mobile application, those are channels; that's the way they reach their customers. So product owner turns out to be a neutral word. But if you work at eBay, well, the website is their product. Even though that website sells products, they're okay with that word being a bit circular. Now, your job is to build a successful product, and a successful product is the intersection of three big concerns. First of all, that product needs to be valuable. That means worth investing in. Look, to understand value, you need to understand your organization and the way it sustains itself or makes money. You need to understand your organization's vision. You need to understand your organization's strategy, because while you may not be responsible for coming up with that vision and strategy, you're responsible for carrying it down to the product you're responsible for. You also need to understand your customers and users. You need to understand their challenges, and you need to understand the market. The market is composed of those customers and users and competitors — the other alternatives or choices those customers have. So to get valuable right: your org, the market. Now, this will help you identify opportunities or features — problems we could solve — but whatever solution we build needs to be usable. That means if they choose it, they can actually start to use it. And to do usable right, you need to understand users, and not just understand them, but the way that they work. You need to understand where the problems are, because you need to come up with solutions.
And by solutions, I don't just mean naming the feature; I mean the design of that solution. Those solutions need to be usable, and it isn't just figuring that out: you need to design those solutions, and you also need to be able to prototype and test those solutions. Now, that's what you're gonna need to do to get a successful product. And it's even worse than that, because any idiot can come up with super cool ideas that we can't afford to build. What you decide to build needs to be feasible given the time and the technology that we've got. Feasible doesn't just mean you know how to code. Since we're all building on a pile of legacy code, you need to understand your code. You need to understand the hidden stuff in code, the stuff that users don't ask for — it's the stuff they complain about if you screw it up — things like security and scale and performance. And furthermore, technology ages super fast. If you want to build technology that you want around for a while, you have to be aware of technology trends. If you are just now figuring out cloud-based architecture, if you're just now starting to think about the implications of AI on your software, if your company still isn't buying that whole mobile-first thing, you know you're behind. If you're building software using only today's technology, you're likely behind. And in fact, understanding what's possible, or soon to be possible, is where real innovation happens: when we can apply technology that hasn't been used yet to new problems and solutions. So look, what I'm saying here is that if you are a product manager working for Marty, this is all the stuff you're responsible for. When I talk about all this, you're supposed to see it as impossible, because it is.
In fact, if you're a product manager, it's important that you work very closely with someone who does understand users and the way they work, and can think through all those things. That person is usually a UX person or a designer. And it's super important that you work closely with somebody who understands technology and how to build this stuff. This is usually someone who is a senior engineer. It's this core group of people that works really closely together. What we want are product teams that have two capabilities: they need to be able to decide together, as a team, what they should build, and they also need to be able to build it. You don't separate deciders from builders. When I started at a product company, when I started with extreme programming, as the customer I actually sat in a different room than the team. This was freaky for me. I'd always sat with my team, and when it came time to decide, I'd always worked closely with people to make those decisions, because I did not have all these answers. As a team, we were responsible for our product's success. We were responsible for outcome and impact, not just time, cost, and scope. That's a very different mindset. Now, when you've got a lot of speed and you're trying to decide fast, one of the concepts that comes up a lot is the idea of three or four people that work really closely together. My friend calls that a core product team. Now, let me show you an example. This is somebody I was working with over the last couple of years. Her name is Belinda. She's a product manager. Her company builds software for real estate property management. And I asked Belinda to send me a picture of her core product team; I wanted to use her in a presentation and talk about her. She sent me a picture that had these other two guys in it. Terry is their UX designer, and Nathan is their tech lead. The picture she sent me was this picture.
I thought it was a little bit weird, but she didn't feel good about sending me a picture of just her core product team. She sent me a picture of her whole team, except for the guy on the right, who was their sensei — this was at kung fu practice. I don't know why she chose to send me that picture, but for her, there was a strong identity as a team. She's a small person, kind of unassuming, but one of the things we look for here is strong leadership. We look for somebody who is a strong collaborator. Product owners are not the people who decide by themselves. They're not the people who do everything on their own. They're the people who lead. Now, I told you those people at Mind the Product were bent out of shape about this. This is the logo for Mind the Product. And what that logo refers to is this intersection between user experience, technology, and business concerns. In fact, this picture that I just drew for you is the freaking logo for the organization. They're trying to say it takes all of us. And while product leadership is incredibly important, it isn't a job that's done alone. If you are working as a product owner or product manager, it's important that you lead a cross-functional team. You'll work with a core team. And one piece of dissonance that's finally clearer to me this year than ever before is that the term product owner does not make sense. Teams own products, not people. And teams should feel accountable for them. Now, let me get to the third thing here. Don't assume that you're building something worthwhile. A lot of what's been going on here the last couple of days has focused on that — on evaluating whether we're building something worthwhile. Let me go back to this model that I drew earlier. The annoying thing about this model is this stupid idea thing.
Well, first, the problem with ideas is that everybody has them. You talk to any customer or user, they will tell you their ideas on what you should build. You talk to any business stakeholder in your organization, they will give you their ideas. If you ever thought for a minute that those ideas came from the product owner, you're nuts. For the people that are product people here, you know that's true. It's listening to all these people; that's where the ideas come from. And if you're successful, if you deliver a product that people like, it only gets worse. There will only be more people making more suggestions, and more stakeholders. So when we talk about these ideas, problem number one is that there are always too many. Now, problem number two — this is a real tough one — is that most of these ideas suck, or won't work out. Now, if you were working for a tech startup right now, and I were to ask you, what's the failure rate of tech startups, how many startups fail? Yell out a number. Yeah, I heard a couple of 90s and things like that. I've seen studies that put the success rate as low as 8%. One study I saw presented at a South by Southwest conference had 130 companies that participated. All of them had received venture capital funding in excess of $100,000. The study ran for two years, and by the end of two years, 80% of those companies were out of business. And the study stopped then, so we don't know what happened after that. The truth is that most new product ideas fail. Now, a weird thing happens. When you start adding features or enhancements, suddenly everybody thinks, from now on our ideas are going to be awesome. And they will add more features and enhancements simply because people ask for them. Now, there are studies that also show that 60 to 70% of features put into products fit into the rarely-or-never-used category. How many of you use Atlassian tools, tools like JIRA or Confluence?
Good, then if you use JIRA, you know what a product looks like where 60% to 70% of the features are rarely or never used. Atlassian has a big strategic problem. They have been listening to customers for a very long time and putting in the things that customers ask for, and they're just now recognizing this is a problem. We've been doing what people ask for. We've been giving people what they wanted, but not what they need. And that's a problem. They're recognizing this because they're getting a lot better at measuring outcome and impact, and they now recognize: we have a lot of features that hardly anybody uses. But the problem is, a few people do use them, and now we have to figure out who we piss off by pulling features out. So look, there's a way that we deal with this uncertainty, which is to expose it. These people are with a U.S. company called CarMax. CarMax sells used cars. They don't just sell used cars; they're officially the largest used car dealership on earth. They have over 170 locations — I'm sure it's close to 180 by now. When I put the date in this slide, they had $14 billion in revenue, but I think that was 2016 revenue; I'm sure it's in the $15 to 16 billion range now, and they sell well over a million cars a year. I don't know how many cars they've sold while I've been talking here, but they sell a lot of cars. Now, these people are a product team at CarMax. There's a UX person and a product manager, and, for better or worse, they're standing next to the engineers, but who they're talking to is leadership. The guy on the right, his name is Shamim; he is the CIO of CarMax. If you look up CarMax and CIO in Forbes magazine, you will find a cool article, an interview with this guy, about how CarMax is embracing product thinking. Part of embracing product thinking is embracing this thing called a hypothesis.
Now, a hypothesis. Look, if you've written user stories before, they have this nifty format where you say: as a user, I want this feature so that I can get some value. It's great to know that users want this, but a hypothesis says: we believe that if we provide this type of person with a particular solution, something good will happen. Their hypothesis focuses on the name of a persona; bound up in that persona are some real people and the problems she has. So look, they believe that these people exist, that they have this particular problem, and that they can build a particular solution for her. Now, these people are on the finance team, and they're working on a way to do away with all the finance paperwork it takes to buy a car using a loan. They believe that if they provide loan decisions to people on all cars, those people can start looking at payments on cars, not prices of cars, and that's gonna help them buy faster. And look, they believe if they provide this capability, people will be more empowered, they will buy faster, but ultimately that's gonna result in this business impact: an increase in conversion. It means they sell more cars. Now look, they're called hypothesis statements, but don't mince words here. Hypothesis is just a flashy word for our guesses, or our bets, or the things we believe are true, and the lean UX and lean startup communities have started embracing this word. Now, almost always when I start using this word, somebody will come up to me and say, you're not really using it in a scientific way, and when you start testing this thing, it's not really science the way you're doing it either. So I decided I don't like that hypothesis thing, and I won't say we believe. What I like is the term bets. So look, I like telling people to say: we bet that this is true.
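That fill-in-the-blanks bet format can be sketched as a template. The persona, capability, outcome, and impact below are paraphrased from the CarMax example as an illustration, not taken from their actual slide.

```python
# A sketch of the hypothesis/bet format as a template: every blank forces
# you to name the persona, the capability, the expected behavior change
# (outcome), and the business result (impact). Example values are invented.

TEMPLATE = ("We bet that if we provide {persona} with {capability}, "
            "they will {outcome}, resulting in {impact}.")

bet = TEMPLATE.format(
    persona="a financing-averse car buyer",
    capability="instant loan decisions on every car",
    outcome="shop by monthly payment instead of sticker price and decide faster",
    impact="higher conversion from browsing to purchase",
)
print(bet)
```

Notice the contrast with the user-story format: a story ends at what the user wants, while the bet doesn't parse unless you also fill in the outcome and impact you expect.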
Now, this is what CarMax bets is true, and every time I show this picture, somebody will say to me, why do these people have such grumpy expressions on their faces? It's a whole longer talk by itself, but their bet is turning out not to be true. It's turning out that by offering this capability to people, they end up with something that actually makes it harder for people to make decisions. It's not going to make them more money, and that's what they're talking about. Now, all these beliefs extend to: we believe we can build this scope, on time, at this cost. But even more importantly, we believe that these people will see it, try it, use it, keep using it, and that it'll benefit our organization. That's the big bet that we've got. Acknowledging that it's a bet is the first step. There's a mantra that comes from people who do this that I like: design like you're right and test like you're wrong. Because we have bias — well, actually, let's go back to this. I said there are always too many ideas, and most of our ideas suck; they won't work out. Let me add a third thing to this. Are there startup accelerators or incubators here in Bangalore? I see a couple of people nodding. Has anybody ever visited a startup accelerator before? Well, basically it's a co-working space where lots of startups are working closely together. They usually get some guidance and a little bit of funding. Anybody visited one of those places before? So there's a few people that have. If you've been in there — they must be super depressing, right? They must be like walking into a hospital where everyone has a terminal disease. Because if I told you that 80% of the employees in your company were going to be laid off in the next two years, your company would be a pretty depressing place to work.
But when you walk into these startup accelerators, people are super happy and upbeat. They're working really hard, they're really positive, and for the life of me I couldn't figure out why. Well, actually, I can. If you talk to any one of them, it's not because they're stupid, and it's not because they're naive about the odds. In fact, anyone there will tell you: yeah, most of these startups here will fail, most of these ideas aren't going to work out. But not ours. We believe our ideas are awesome. And when it comes time to build, the temptation is just to focus on pushing more of this stuff out faster. Well, this is your challenge as a product person. This is what you know: it isn't time, cost, and scope. Your job is not to build more faster. Your job is to build less. Your job is to minimize output at the same time that you maximize outcome and impact. And one of the ways we start to do that is to acknowledge that what we're working with are hypotheses, that we're not sure. Sometimes we talk about designing like you're right. When we're coming up with ideas, we do that with an optimistic hat on, with what we call yellow-hat thinking. But when it comes time to test, we black-hat our ideas. Now, if you've ever seen an American Western movie, you know the bad guy is the one in the black hat. I don't know, what do bad guys wear in Bollywood movies? Can you say that again? They don't have hair, is that what you said? All right, Alan, are you still here? But look, there are cues, and it helps to put on a black hat and ask: okay, we believe this is true, but what are all the things that could go wrong? In fact, if we start with that hypothesis, we build this weird kind of backlog, and it is not filled with all the things we want to build. It is filled with all of those risks, all those fears, all the open questions, and all the assumptions we're making that might be wrong.
I'm going to call this a learning backlog. What we then focus on is the riskiest thing, the scariest thing, the thing that bothers us the most, and if we can acknowledge that as a question, as something we need to learn, then we figure out the least we could possibly do to test it. Now, when we start looking at ways to test, we've got a lot of choices. They range from fast and cheap to expensive and time-consuming: things that will cost us a lot of money and things that will cost us pennies. Now, I've been watching people deliver design discussions all day long. So look, if I believe people have a particular problem, and I want to validate some of the beliefs bound up in one of those personas, then I do things like user interviews. I can go out and observe people. I can do things like a survey, but surveys can be a bit flawed: I miss out on a lot of the subjective information, and it's not a two-person conversation. But if my biggest concern is whether people actually have the problem we're solving, I can do that. Now, if my concern is whether the solution I have in mind will work for them, one of the things I can do is just put a prototype in front of them. Not something good enough to use, but something that allows them to imagine having that solution. The reaction we're looking for is: that looks awesome, when can I have it? Sometimes early prototypes aren't about usability testing, they're about testing value. Often we can move to higher-fidelity prototypes. Sometimes it becomes necessary to let people see something work like it's real; my friend Marty Cagan refers to this as a live-data prototype. And somewhere in here, for some companies, there are things like A/B testing. If you're building consumer-facing products, that's one thing that works.
And at some point, you might actually deliver a working version, but only to a few customers. It's common for enterprise-class companies to create something for a small pilot group of customers. In this working version, I don't need to worry about scale, and I don't need to worry about performance quite as much. In fact, since I'm delivering this working version to a small subset of people, it doesn't even need to be perfect. It needs to be good enough that I can start to learn from it. Now, of all the things we can do to test a hypothesis, the thing I'm going to put out here on the right, the very most expensive thing we can do, is to build scalable, shippable software, or what in agile development is often called potentially shippable software. Those people at CarMax were sitting on something that would cost millions of dollars and take well over a year to build, given what they were trying to do. If they strongly believed in it, they could launch a project and start to build that thing. That's the most expensive way to test an idea. Now, I'm going to put this in a two-by-two. This two-by-two comes from a guy named Giff Constable; I've bastardized it a little bit. Giff wrote a book called Talking to Humans. It's a really thin book on doing user research. I would tell you to rush right out and buy it, and you can, but he gives it away for free. It's talkingtohumans.com, so just download it. Now, the idea here is that the vertical axis is about, well, risk. If the risk of doing something is low, it sits high on the axis. We can also look at that axis as confidence: confidence is high up on this axis.
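The learning backlog and the cheap-to-expensive test spectrum described above can be sketched together: pick the riskiest assumption, then pick the least we could possibly do to test it. The backlog items, risk scores, and spectrum labels here are invented for illustration:

```python
# A "learning backlog": not features to build, but risks, fears, and
# assumptions that might be wrong. Items and scores are illustrative.
learning_backlog = [
    {"assumption": "buyers actually want payment-first shopping", "risk": 9},
    {"assumption": "loan decisions can be computed for every car", "risk": 6},
    {"assumption": "staff will surface the new numbers to buyers", "risk": 4},
]

# Focus on the riskiest, scariest assumption first...
learning_backlog.sort(key=lambda item: item["risk"], reverse=True)
riskiest = learning_backlog[0]

# ...and test it with the cheapest thing that could possibly teach us,
# moving right along this spectrum only as confidence grows.
tests_cheap_to_expensive = [
    "user interviews and observation",
    "survey",
    "low-fidelity prototype",
    "high-fidelity or live-data prototype",
    "working version for a small pilot group",
    "scalable, shippable software",
]
next_test = tests_cheap_to_expensive[0]
print(f"Learn next: {riskiest['assumption']!r} via {next_test}")
```

The point of the ordering is that money and time flow toward the biggest open question, not toward the next feature on a list.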
Now, there's a question I learned from a guy who came out of Adobe in the Bay Area. He would always ask people the "what would you bet me?" question. If you've got a hypothesis: would you bet me lunch that this will work out? Would you bet me a day's pay? Would you bet me your car? Would you bet me your house? Would you bet me your retirement savings that this will work out? For some people, their car might be up here and their retirement savings may be down there, but you get the idea. Now, the way you figure out your next best test is to ask: what would we bet that we're right? If the most we're willing to bet is lunch, and we're building scalable, shippable software, that puts you right here on this chart. This is the stupid zone. Stupid and expensive. And some companies are really rigorous about always doing all of this work: we always interview customers, we always do prototype tests, we always do a round of usability testing. But if we already know we're solving a real problem, if everybody would bet their retirement savings that we're right, then running all those tests anyway, well, that might put you up here, and this is also stupid: it's wasting time. On this chart, the safe zone runs right up the middle. That's where our next best move lives. Our goal is to scale the amount of time and money we invest with our confidence. Now, if I go back to this loop, the way it moves forward: we figure out our way to test, we create the test, then ideally we get out, put it in front of customers, and observe, watch them use it. When we're done with that, we get back data. And we use that data to change what we believe is true, to change our hypotheses, to change what we believe we need to learn next, and we go around the loop again.
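The "what would you bet me?" heuristic above amounts to matching the cost of the next test to your confidence. Here is a rough sketch of that two-by-two as code; the stake ordering, cost tiers, and zone thresholds are my own simplification, not the chart from the talk:

```python
# Stakes from smallest to largest: a bigger stake means more confidence.
STAKES = ["lunch", "a day's pay", "my car", "my house", "my retirement savings"]

# Test cost tiers from cheapest to most expensive (illustrative).
TEST_COST = {
    "interview": 0,
    "prototype test": 1,
    "pilot release": 3,
    "shippable software": 4,
}

def zone(stake: str, test: str) -> str:
    """Classify a (confidence, test-cost) pair on the two-by-two."""
    confidence = STAKES.index(stake)  # higher stake = higher confidence
    cost = TEST_COST[test]
    if cost > confidence + 1:
        # Betting only lunch but building real software
        return "stupid and expensive"
    if confidence > cost + 1:
        # Would bet the house, yet still re-running cheap tests
        return "stupid, wasting time"
    return "safe zone"

print(zone("lunch", "shippable software"))
print(zone("my retirement savings", "interview"))
print(zone("a day's pay", "prototype test"))
```

The threshold of one tier of slack is arbitrary; the real judgment call is the team's, but the shape of the check (spend scales with confidence) is the point.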
Now, in lean startup lingo, this is referred to as a build-measure-learn cycle. In lean user experience language, it's a think-make-check cycle. But all of these are validated learning cycles. Validated learning is something we try to keep fast and cheap, and the goal of validated learning is not potentially shippable software. The goal of validated learning is information. We focus on learning velocity here. Let me see if I can pull this together. Look, when we're trying to de-risk something, if you're a product person, it's important for you to communicate your product bets. We use small bets, experiments, tests, learning activities, to remove uncertainty from our big bets, because we know that most of our ideas won't work out, not the way we thought they would. Now, the last two points are going to go fairly quickly. Look, one of the things we do is leverage data. We need to be able to measure how things work. But I need to tell you: stop spending so much time in the office. When I go through my hard drive, I have lots of pictures of me sitting with people who use my products. I used to build software for brick-and-mortar retailers, retailers that had anywhere from 100 physical stores to thousands of stores. My company was bought by salesforce.com last year for $2.8 billion, so it's now Salesforce's Commerce Cloud, and the people I was working with are now product managers at that company. When I look back through my slides, old pictures, and you can tell they're old by what the monitors look like, I have lots of pictures of me sitting and talking and working with users. So just a couple of others here. Look, this guy is a stock portfolio manager managing a fixed-income fund, and it's by sitting with him and his team that we figure out how they actually work and what their chaotic environment is like.
And he works very differently from this team of people managing an active equity fund, and those differences in how they work start to matter a lot in how we build this product. Sitting with someone desk-side as they show us how they do their job is one of the things we do to learn. It's talking with people, it's being there, that makes the difference. The guy on the right is a product manager. He works for Kodak, and yes, they're still in business; I'll show one other picture from Kodak. The guy on the left is a designer, but he doesn't design user interfaces, he designs portable printers. The guy in the middle will take a picture of you and your family in front of the Gateway of India monument in Mumbai for 30 rupees. He uses portable printers. That printer does not look very portable to me. In working with them, they spend time out there in the hot sun talking with the people who use their product. They build friendships with these people. This guy on the right is a product manager, and he manages a kiosk that goes into grocery stores and pharmacies. With the kiosk, I bring in my camera or a USB stick and I can print out a high-quality picture. The person standing to the left is a UX person. The person standing in the back is a lead engineer; he's mostly taking notes. The short white-haired person is not on their team. She's been trying to get a picture out for the last hour. Now, one of the things they've learned is that those short white-haired ladies do not pass interview screeners. They never get recruited. They never show up for usability tests. They don't respond to ads for test participants. They found that they didn't get a real perspective on people until they went out and actually watched them work and understood it. Now, this guy on the right gets a lot of demands to add more stuff to this kiosk. It's not enough to just print your picture out.
His company is pushing him to make sure people can put pictures on coffee mugs, or print out posters, or put them on calendars. But at the end of this day, we came back, I took this picture, and we did a little bit of reflection, and he said: this has been the worst day of my life. I never understood how much damage I've been inflicting on people. And I promise you, if you talk to the people using your product, you will never prioritize a backlog the same way again. When we're working, we're complementing subjective data, the things that help us build empathy with our customers, with objective data. And we're not doing it alone. We're not sending UX people out to do it by themselves. We're not sending product managers out to do it by themselves. We want teams to go out and do this. Now, look, let me summarize this and give you the last point, because I'm over time here. Everyone needs to spend face-to-face time with customers and users to build empathy and insight. It's not just a product owner's job. Look, in this build-measure-learn mantra, or this think-make-check mantra, there are things we do that are building things. But by build, we don't mean build software; we mean things like prototypes and, well, other kinds of tests. And by measure, we don't always mean analytical data; we mean we can observe, we can talk to people. And by learn, well, that's what comes out of it. When we learn, we make sense of what came out. We synthesize what we've learned. We build simple models to show this stuff. Now, when I see teams doing this kind of work, they do what these people did: they bring developers out with them, because we want the whole team to have empathy and insight and understanding. The guy on the right is a product manager. The guy on the left is a customer. The guy talking with the customer, who's been using a prototype, is a QA person.
It used to be the QA person's job to just go along, take notes, and make sure they captured the understanding. But over time, this QA person found that, well, he had fun doing this stuff. He had an aptitude for it. He liked doing it. And more and more, this guy started leading interviews. Now, this picture is really blurry for two reasons: it's an old picture, and I was sitting way across the room and had to zoom in to take it. I'm sitting across the room for a couple of reasons: because I want them to be able to do this stuff alone, and because I'm sitting next to the lead engineer, who is deathly afraid to talk to the customer. But once a week they go out and do this, and they want him to come along. They want him to see what it's like, to pay attention to the body language, and when they finish the interview, they'll sit back down together and share notes. They'll compare; they'll talk about what they've learned. This guy I actually met here at this conference years ago. His name is Sherif. He's a product manager at Atlassian. When I met him, he was a product manager for Confluence, and now he leads all their cloud-based stuff. He's been there a long time. If you Google his name, you can find webinars he's taught; his name's all over the place. But I remember him saying to me: I don't do customer interviews without having a developer in the room. The number of light-bulb moments, the empathy gained, it's phenomenal. And when we come back and synthesize, it's quick-and-dirty models that contain insights: things like personas that might have started as assumptions but now contain data we've pulled from customer interviews, and things like simple journey maps. I'm sure some of you learned journey mapping today, and others may already know how to do it, but those help us understand how people work today. More of these simple personas.
Some people here may have participated in things like design sketching activities, maybe you've heard of something like a design sprint. This is where we don't send the designer off to design the UI; we ask everybody to take a stab at what they think the user experience should look like, and then we come back and share. When we get done, we pool everybody's ideas, and it falls on UX people to make sense of which ideas are good and which aren't, because they're not all good. And I talked about building and working with simple prototypes. Look, the important thing to recognize is that this is all work the team can help with. Organizations that work this way use sprint planning to talk about the hypotheses they have in play right now, to talk about how team members can participate over the next couple of weeks, and to set a time box for the amount of discovery work they'll do. Every day at the daily standup, we talk not only about what we're building, but about what we're trying to learn right now during this two-week sprint. And every time we do a sprint review, we talk about what experiments we've run, what we've learned, and, well, which hypotheses move forward. Discovery work isn't the stuff you wait for the product owner to do. If you're a product team, it's work the whole team has to do. We keep it visible, we talk about it, we plan for it, we review it, and we reflect on it. For teams at Atlassian, this is the way they work. For teams at CarMax that I showed, this is the way they work. For teams at that little company, Hightower, that I mentioned, it's the way they work. And for the organizations I work with, this is what they're striving for: a product-centric view of agile development. But if I summarize all this stuff: your job isn't to build more stuff. Agile shouldn't mean build more faster.
What we're trying to do from a product perspective is minimize how much we build while we maximize and measure outcome and impact. Product teams own products. Product owners and product managers lead cross-functional product teams that can both decide what they're building and actually execute and build it. We test our beliefs using small tests, and throughout the day today you've heard a lot of people talking about small tests. We do this because we're wrong so much. And look, it isn't just hands-off, it's hands-on. We build empathy. We actually take team time and get out there. And finally, because we know we've got to do this, we incorporate all of that into our agile practice. Hey, let me ask you really quickly: of all the stuff I mentioned here, how many people are working in an organization that does a lot of, or all of, this stuff? Let me ask you on a scale of one to five: how much of this do you do? Five is: we do all of that stuff, and I don't know why I bothered to listen to you for the last hour. Three is: well, we do some of that stuff. And one is: you're smoking crack, no one can do that stuff. On a scale of one to five, how much of this do you do? Show me a number. I'm seeing twos and threes; I see a four. And I see a fist: that's none of the above. If you believe this is impossible, I should have had you stand up and look around, because there are people in this room at four who are striving to do this stuff. More and more, I'm seeing a big disparity between product-centric organizations that have let go of common agile practice and incorporated a lot of this, and, well, people being sort of dogmatically agile. There's a big difference. All right, I know my talks are always too long. They always have too much in them, and there's never any time for questions, but I will still take questions. Thank you very much for the last hour.