My name is Sophie. I will be your host for this session. We're only going to talk about user research in Agile, so if you're here for anything else, my wonderful colleagues in other rooms are there for you. You may have heard Noresh introduce this event: you're free to go where you need to go so you can learn. So if this doesn't rock your boat, don't worry about slipping out the front. This is an awkward room. Just go get what you need from this event; that's the most important thing for me. I'm a user researcher. I'm a UX designer. I'm a product owner. I'm an innovation coach and I'm an Agile coach. All these things work together, and you need to understand all of them to do the job that I do. I help clients all over the world make amazing products and ship them. I'm not interested in things that don't get finished, that don't get delivered, that don't get into the hands of the customer. So my game is to get to delivery, and to get something tangible into the hands of our customer as early as possible. This is all about how to do that when working with developers in an Agile cadence: really bringing user research and product definition to the same pace. That's not easy, because user research has a very unfair reputation. It's very expensive bad news. And it's often very expensive bad news because user research is done towards the end of the build, when you have built enough, and it's all about testing just before you go live. Any bad news you get at that point, there's no time, no budget, no resource to do anything other than stop everything and start crying. But you can fit into the Agile cadence, and I'm gonna tell you how. Remember the manifesto? Bit of a refresher. User research can help you with two things on the manifesto. It can help you with working software, where the definition is working software for the end user. So it doesn't just work; it works exactly as they need it to, to do what they want to do.
It's quite different to QA testing. QA testing checks that the functionality does what we said it would do. User research testing checks that the functionality really delivers the outcomes the user needed from it. And customer collaboration, where we extend the definition of customer not just to team customers but to the real end customer, who may be a paying customer or not. Very wide definition. Are we all clear on customer, user, paying customer, non-paying customer? What you really wanna do is engage your users and customers, and you want to deliver working software that delivers delight, not just efficiency and efficacy. You want to verify your features earlier than expected. You want unbiased, objective insight generation, and you really wanna get to that satisfaction. This is where user research helps in terms of Agile. Now, Agile can no longer be your excuse. It has been an excuse for a very long time. You know: we're very fast, we're very close to the delivery deadline, it's very technical. Well, actually, this is not an excuse for putting user research outside your cadence. If you start pushing things outside of your Agile cadence and mechanism, what happens? You've created waterfall. You've created your little fake Agile bubble, disconnected from the reality of the business and the product. Not great. It's gonna come and bite you very quickly. It also creates documentation and, already, a silo. And it also means that the research is done in a cadence that's different to yours, so you can't integrate its findings, and it's gonna really come and hurt you. You also cannot have the excuse that research needs to be done outside the team. I'm a proponent that designers and researchers need to be part of the team, sitting with the developers, the engineers, the testers, the product owner. This is the best place for us as researchers and designers, because we're on hand for any question, any time. So I'm in India this week.
I've got a developer in Frankfurt, and he pinged me a couple of questions yesterday because he knows that even if I'm here, I'm close enough. He knows he can ask, no matter how little or small. These are the little insights: when you share them quickly, your delivery starts accelerating, because you get validation immediately. Your validation loops start getting very, very small and tight. So no more excuses for not doing user research within Agile. Especially not when I'm done; so give me another 35 minutes. Now, I'm gonna give you a really quick refresher on user research, and if you need any more details, don't hesitate to ask. The research cycle has five elements. Who can throw me these five elements? Anyone? Anyone? Sorry, say again. Identify who you're going to. Yeah, this one. Identify, spot on. I wish I had candy or something. What else? What's next, do you think? Empathize with needs, create personas. What do these things help with? Are you doing research already? Are you meeting with your customers already? Not yet. What do you need to do before you meet with them? Understand their needs? Well, that's the point of the research, people. What sort of communication? What kind of communication are you gonna do? Well, we're gonna do research. This is a user research cycle. Get to the, what else? What specifically are we gonna do now? Meet? Not quite. I'm gonna help you. We're gonna prepare how we're gonna meet them. So communication, a little bit, but we really wanna ask: who am I meeting? When am I meeting them? Am I meeting them in a Starbucks? Am I renting a usability lab? Am I meeting them in their workplace or in their apartment? Am I gonna stop random strangers on the street? Do I need to book their time? Do I need to pay for their time? There's prep. You know, if we decide that we want to go on Friday, tomorrow, to understand how we can improve payment for getting onto the public transport system here in Bangalore, we need to prepare a little bit.
Then we do the next step, which is we actually run the research. That's the actual meeting with people: asking them the questions, showing them the prototype, showing them the product, observing them, taking notes, recording, taking photos, taking it all in. Once we've done this, what do we need to do? Collate the information. Yeah, we need to analyze it. Analysis is a big, big, complicated part. The more information you gather here, the more analysis you have to do there. And finally, what do we do? There's an empty spot over there on the side. Solutionize? What did I hear? You design. Remember, we're working with an Agile team, we're working with people. What do we do with what we've learned from the analysis? Improve it? Okay, I'm gonna take a little step back. We wanna share it. We wanna share it. Research needs to be shared: literally, the research and the analysis and the findings need to be shared. Once you've shared, then you can use those insights to decide what you do. And what you do is very simple: you design, you improve, you delete. But that happens after you socialize your research. This is what we learned. And they're very straightforward: step, step, step, step, step. You can't do them in any other order. You can try; it doesn't work. Now, when you've got those five steps, you've got three more important things to pay attention to. The first one is that you need to involve everyone in deciding what kind of research you want. You don't start your identification by going solo and saying, well, I'm the user researcher, I know what we're doing. Eventually I might ask the product owner. But user research is not a cowboy sport, okay? You're not a lone rider into the sunset. So work with the team; identify what research needs to be done to serve these people, because they're the ones who're gonna benefit from it in the end. And to make sure that they don't tell you, well, that was a bit unnecessary.
We can't do anything with the insights you've given us. Wow, that was a lot of money for not much. You wanna make sure they're present at the beginning. These can be the internal team, the greater team, the product strategy people within your organization: whoever is going to benefit from the insights. And you also want to involve them a little bit in the run. I like to take people along with me when doing research, so they actually see a genuine, authentic, wild user in their natural environment. I know they're rare to find. They usually roam outside of your office building. But you need to be very attentive and place camera traps together. How are we doing so far? Yes? So, so. This is really not what I wanted; I think I'm gonna go. Okay, let's continue. Now that we've got this research cycle, I wanna shift that back into the context of the product life cycle, because you do different research at different times, and I really wanna make that clear too. So I'm gonna break it down into very small chunks so that in the next 10 minutes, you will totally crack it. A product life cycle. So I'm not talking about a business, but a product life cycle has pretty much six steps. How are we reading that in the back? I'm gonna read them to you, sorry. You start with an idea. You explore whether the idea is worth pursuing. You do your experiments to make sure your idea has enough traction that there's validity in doing it. You launch your product. And then, if everything goes well, you keep going. You grow your product. You grow your customer base. You start growing your whole business. You get your product to maturity: it's got cruising speed, it's bringing consistent, good results. And then at some point you'll retire it. You take it down. So from birth to death, that's how a product goes when it's well managed. Any surprises in the product life cycle? Okay, now we're gonna do research at pretty much all of these stages. There is one stage at which we don't do research.
Which one do you think it is? The retirement, exactly. Maybe there's another product replacing it, but the budget would be on the other product. The effort is on the new product, no longer on this one. Sales can manage the customers. Engineers can pull the cables out of the servers. User research can move on. Everything else, we're going to do like this. In the idea stage, we help with understanding what goes on. Who are these people? What do they need? What is their context? Then, as the business experiments to understand and to find that product-market fit, we start experimenting more. And then when you start launching, and you start having access to more people, and you start getting a better idea, you just research more. In growth, you start researching a bit less. You're growing your product, but it should not be as intensive an effort as it is here, when you are all rockets blazing. If you heard this morning's talk: getting a rocket out of the atmosphere is where you put the biggest power; the most fuel is used for that launch sequence. Same thing for business. This is your high-fuel, high-effort, high-G moment. Then you grow, and at maturity, you wanna keep a bit going on to make sure you're not completely off the mark. And then your researcher can move on. I'm gonna break down why you need so much research at this point, but bear in mind, you're gonna see this build over the next few slides. So let's put it all down. And this is what happens. When you have a new product, the thing you start with is risk. It may not work. Nobody may want it, nobody may like it. We may not be able to build it. We may not be able to market it. People may not be able to afford it. Lots of risk when you start in this idea stage. So part of your research is going to explore and minimize the risk. And as you understand more of what goes on, the result will be that the risk will go down.
The reason you're continuing is because you've seen your risk go down. And once you're in growth, you're at very low risk. Like, you should be. Ideally, your risk should have lowered even earlier, but let's be realistic: it happens there. Once your product is at maturity, you should have zero risk. You know your product is fit for your market. You know who your users are. You just need to sell more, more often, to more people. So that's the risk part. Let's look at your investment, because finance plays a big part in your product. You want to start with little investment. Let's just put in a little money to see if it works. What counts as a little money depends on whether you're a little guy or a huge corporation; the number of zeros changes. But as you progress, you start investing more and more. Your launch is your big expense. These are not really to scale, but just to give you an idea. And as you grow, you want to keep investing, but the investment is your money, okay? No one else's. And you still have to invest a little bit, probably, at this final stage, to pull those final cables. You probably still need a little bit of money. Are we still doing good? Okay, so after your investment, what comes in? The revenue. And there is nothing at the beginning, because we have no product. But when you start launching, you start seeing something, and this is where your revenue comes in. Your growth and your maturity, that's where you make your money, okay? And you may still make some in retirement, but it's probably just not worth it. If you look at the revenue curve, the revenue starts tapering off. At this point in time, it's not worth it; we can do something else. So now let's put all of these together, okay? Let's put the revenue with our investment. As you can see, in the beginning it's my money being put in, and at some point I don't need to put in so much, because the revenue is coming in. So money is still coming into the business, just from two different sources.
And now if we add in the next thing, which is the risk, as you can see, the relationship here is that I am adding more money in, I'm investing more, because I see the risk is lower. I am confident it's worth pushing my investment. I know the risk is lower because I've researched my product. All of these things work together. And as I get to my launch, this is the critical part: is my risk still going down? Is my revenue going up? And is my optimism worth it? I need to really invest so I can get this set up. Again, this is a little bit exaggerated; I'm a bit optimistic with that one. And this is why you need so much research here, but also there. Because the research is gonna make your risk go down, and therefore your confidence go up. The research is gonna justify amping up the investment, and the result of the investment, and of understanding what you need to do, means that your revenue can come. There's work you don't see in there, which is the actual work done with all this money. But this is how it fits in. And you need the research to know it. And I really believe that needs to be user research, product centric. Questions, concerns, metaphysical interrogations, deep doubt? We good? Okay. So this is why user research needs to be at the core. It feeds into the core product mechanisms. It is not design fluff. It is not "pretty, but do we need to put it here or there?" It is hardcore business, people. So you want to research more during greater product uncertainty. And because you want to reduce risk, you want to minimize your investment. I'm all for lazy business running. I don't want to put in any more money than it needs to succeed. I don't want to put in any more effort than it needs to succeed. Why should I do more if I can do less and get the same result? And you really want to protect your revenue, very proactively. Now, to get there, what you want to do is research more.
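The relationship described here, research effort tracking product risk, can be sketched in a few lines. This is only a toy illustration: the stages come from the talk, but every risk score and the budget figure are invented numbers for the sake of the example.

```python
# Toy model: allocate a fixed research budget across life cycle stages
# in proportion to how much risk (uncertainty) remains at each stage.
# All numbers are invented for illustration.

RISK = {  # hypothetical risk scores, 0.0 (none) to 1.0 (maximum)
    "idea": 0.9,
    "experiment": 0.7,
    "launch": 0.4,
    "growth": 0.2,
    "maturity": 0.05,
    "retirement": 0.0,
}

def research_effort(stage: str, budget_days: int = 100) -> int:
    """Research days for a stage, proportional to its share of total risk."""
    total_risk = sum(RISK.values())
    return round(budget_days * RISK[stage] / total_risk)

for stage in RISK:
    print(f"{stage:>10}: {research_effort(stage)} days")
```

The point of the sketch is the shape, not the numbers: almost all the effort lands in the idea, experiment, and launch stages, and none at retirement, exactly as the talk argues.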
I'm gonna bring this back to the Agile cadence now. Research can get very complex, and your key factors of complexity are here: the number of topics you need to research (topics could be contexts, number of features, number of pages, number of customer groups, number of markets), the number of users met, and the number of research languages. For anyone who does multilingual projects: you need to do your research in each language, because you have different cultural backgrounds. If you try to push your values onto another market, it's not gonna go well. How much do you like it when we just throw American products into India? There you go, this is how we do it in America, this is how we do it in Europe. And you're like, no, this is India, this is Japan, this is China. We do things differently. Our banks work differently. Our transport works differently. Our social media works differently. In China, they don't have WhatsApp. Good luck if your support is on WhatsApp. So you need to count those research complexity factors, and to minimize your research complexity, it's very simple: you minimize the number of topics and you spread the number of users over time. So you will eventually meet with a hundred people, but instead of doing it in a week or two, which is very optimistic, or let's say three or four weeks, you're gonna do it over nine months. And that's because, remember, the gift of Agile is this ability to give you more time, because you spread the work. You consciously spread the work and use the time. Let's continue with that. Or not; pushed the wrong button on the remote. That's not my bad. What we're really using is the wonderful power of Agile to use simple scenarios: you scrub, you rinse, you repeat. And the secret is in the repeating. I call washing dishes the gift that keeps on giving. You're never done. As soon as you're done, it starts appearing again.
User research is exactly the same, but in a good way, because it keeps giving you good things; clean plates are a blessing. So think of dishes: a necessary evil sometimes, but it helps you get on. Now, remember that research cycle, okay? To do just enough to deliver, in this smart, focused, savvy mindset, your sharing in Agile is gonna go a little bit further, and it goes further by including these two things, because these people are gonna build and deliver. Once you understand that your research cycle goes all the way here, that it has to fit into what's being built and what's being delivered, then you start shifting your research a little. What I'm saying is that you can have research endeavors that are big, bulky projects that feed a lot of strategic insights, a lot of deep thought, that help pilot big organization roadmaps and 20-year strategies. I'm not talking about that kind of research. The game is to deliver; the game is to ship. Let me take a sip of water here. And to do that, to fit into that cycle, you can use some research activities that work well when they're done short and repeated. Your ethnographic studies are literally go and see. So if you're redesigning, let's say, ticketing for the public transport system, you wanna park yourself in a station and observe during a couple of days: at rush hour, at quiet times, early in the morning when people are not awake, late at night when they're drunk. You wanna observe what's really going on. You wanna observe on a very busy travel day, when people have lots of suitcases and kids, and you wanna observe on a business day, when people have a little briefcase or a backpack, what goes on. Contextual inquiries are discussions: you ask questions, the person answers. It's a conversation, but it's a very, very managed conversation. Single journey testing with preset values; apologies, the comment is missing on the slide.
When you want to test actual pages or journeys, to make it smaller, just test one, okay? Test one happy path, test one unhappy path, test one failure scenario. You can also do card sorting and tree testing to assess your navigation; I'll tell you more about that, I've got more slides. To really get to just enough insight, technically, you shorten the session. We often say we need 60 to 90 minutes. I'm like, no, no, no: give me 25 minutes. I can do it in 25, and I find that 25 to 45 minutes is actually my peak time. I can focus 100% for that amount of time and really maximize it. My attention starts to wane after 45 minutes because, remember, in user research you are doing many things. I am thinking of my research. I am thinking of my questions. I'm listening to the person, observing. I'm making notes. I am checking that my equipment is still recording, still working, whatever it is. And I'm keeping an eye on the time. The cognitive load is massive, so shorter sessions actually do bring better results. You wanna conduct longitudinal studies. Longitudinal studies are, very simply, meeting the same people again over time. So don't just meet one group once; amongst the people you meet, go and see some of them on day one, and then in week three, and then in week nine, so that they see different things and you start to learn about them as well. Incorporating longitudinal studies is usually not done in big user research programs, because you do one big chunk of research and you're not coming back to repeat the same massive endeavor three months later. But because we're doing it Agile and we've got several months, we can. I have been on projects where we had a year to delivery. There was a lot to deliver, but that meant I knew I could come and meet those people several times over the course of one year. Ignore what works, or works well enough.
In big research, stakeholders usually want to be reassured that there are things that do work. In agile research, anything that works I pay no attention to whatsoever. It works, done, thank you, bye. I'm gonna focus on what doesn't, because that is where we need to put our work. It's not a vanity endeavor; it's a very pragmatic, technical endeavor. I iterate faulty designs between sessions. In a big, formal research project, where you wanna make sure you've got consistency, you show the same thing, and you notice the same mistake as many times as you have sessions. In agile, I iterate. If I notice within two or three people that, yeah, that label really doesn't work, that button isn't contrasted enough, whatever, I just fix it in my design right away, rather than fixing it later. No need to delay the pain; let me learn something else. Once I've removed that problem, what other problem shows up? So, some focus areas you can look at to be really, really narrow, really focused, and to do it quickly and repeat. Your navigation: just the navigation. Too often we say we test the navigation, but we look at the content of the pages as well. Just the nav. Names and labels, so copy and terminology. Too often, especially with established businesses and established products, you start using your internal lingo while customers use something else. User research needs to go and get those labels, needs to go and understand what people call these things, so that they recognize them and don't have to learn what you call that thing. So just check the names and the labels. Where to click to start and to complete a task: is it obvious? They seem like micro goals, but believe me, when you know that, you're already one step further. Achieving a goal across a few pages, so you minimize the extent of what you're testing; you can't have a whole 20-page click-around. And finally, simply finding information on a page. So many of our user goals are very simple.
Am I friends with so-and-so on the social network? How can I find that quickly? I've just typed a message: how do I send it? How do I attach a photo? How do I know they've read my message? Okay, I want to check my bank account: how do I log in? Very, very simple things. What's my bank balance? Do I have a clear, simple number that tells me whether, yes or no, I can go and buy myself a few dozen very colorful outfits at the mall next door? The answer is no, by the way. And some more tools and techniques, okay? Because in Agile, you have some musts. The tools and techniques need to focus on the fact that you must research only what can be immediately used to either design, iterate, or build. For me, Agile concentrates focus onto the immediate time, which, to be honest, is the best place to be. If I research only what's relevant right now, that means I have those insights at the tip of my fingers when my QA tester asks me down the line, oh, but wouldn't they try and do this? And then I can say: I haven't seen it in research. I'm not saying they would never do it; we haven't seen it so far. Why don't we not build it until we see signs that we need to? And I design very much in the way of only building what we see is needed, when the event happens, because we're in an Agile cadence. We know we can react. We have capacity to react. So I have confidence that we can do it later, when it arises. And if it never happens, then we haven't done the work. So: very much focus. Which means, to take a step back, this kind of research is really not good for anything very strategic: planning, roadmaps, big endeavors, business model definition. Think the Lean Startup. To establish your product-market fit on the business side, rather than "here are your users, this is what they do", don't do these short sessions; be more in-depth, more thorough. Different approach, different goal.
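The "short and repeat" cadence described earlier, a hundred users spread over nine months instead of a two-week big bang, can be sketched as a tiny scheduler. This is a minimal sketch under invented assumptions: two-week sprints, six sessions per sprint, and made-up participant names; `schedule_sessions` is a hypothetical helper, not a real library function.

```python
# Spread a pool of research participants across sprints, a small batch
# per sprint, so every sprint delivers immediately usable sessions.
from itertools import islice

def schedule_sessions(participants, sprints, per_sprint):
    """Yield (sprint_number, batch) pairs, a few participants per sprint."""
    it = iter(participants)
    for sprint in range(1, sprints + 1):
        batch = list(islice(it, per_sprint))
        if not batch:          # pool exhausted, stop early
            break
        yield sprint, batch

# 100 participants over ~nine months of two-week sprints, 6 sessions each.
users = [f"user-{i:03d}" for i in range(1, 101)]
plan = dict(schedule_sessions(users, sprints=18, per_sprint=6))
print(len(plan), "sprints of research;", len(plan[1]), "sessions in sprint 1")
```

Same hundred people, but each sprint only carries a handful of short sessions, which is what lets findings feed straight back into what the team is building that sprint.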
Another tool and technique to get to just the right thing: your design needs to use practices like this one, which is called atomic design. Talk to your designers. Any designers in the room? Yay, there's one, thank you. Atomic design helps you very much. In atomic design, you start with atoms. An atom could be the style of your radio buttons, the style of your checkboxes, the style of your H1s. Your molecules are a group of things: a field and a label and a submit button. Your organisms start being how a whole thing behaves. And then the templates. Templates used to be super hot a few years ago, like 10 years ago, and now they've completely gone out of fashion, but we do have pages that always look the same, so bring back the templates. And then you have your pages. Once you start using atomic design like this, you start being able to focus. Do I need to learn whether my super funky, unique, branded radio buttons work or not? I can, without having to test the whole page, with the content and the proposition and the goal and blah, blah, blah. Actually, do our radio buttons look like radio buttons? Are we complicating our users' lives? The answer is often yes, by the way. Don't brand your radio buttons; just use the simple ones. Use your branding elsewhere. You need to know what to test as well. Very often I find that the really big research endeavors happen simply because we don't know, we're not sure, so we're gonna throw money at it. I kinda like throwing money at things when I have money, but unfortunately, when you just throw randomly, you get random stuff back. To know what to test, use your product roadmap. And use the product roadmap not for the "what we will do" but for "what do we need now?" To get to this point here, we need to have this by now. To get to this outcome, we need to have built this feature. So a good product roadmap is very much going to help you understand what you need to research. And really, really have the courage to test scarily little.
You may be thinking: I can't really just go and show them five radio buttons. Do go and show them five radio buttons. Just spend 20 minutes over two days with 10 people, bring back those learnings, and get rid of that problem forever. Because otherwise, these things are like a little plague that will just come and rot your design through and through. You have lots of little user research insights that start as little things you could have caught right there and then. And then they rot, and they get worse, and then you can't tell anything from anything. It's this big mass of mold. How are we doing? So far so good, yeah? Okay, let me continue a little bit on that. First, my recommendation is: if you cannot afford to research your users, you cannot afford to be in business. Because you will learn, late, that you don't understand these people. Research is the cost of being in business. Any company that doesn't understand that learns it later on. The really big companies, the Googles, the Facebooks: do you know how much money they spend on user research? They send people all over the world with full teams, translators, big guns. Because they know that if they mess up in China, they've closed that market for five years. They know the risk of not doing it; they know how to do business. So you need to go and research your users, or you need to be able to tell your CEO that the money went down the drain. Forget the big advertising and marketing campaign: it's not going to sell if you have fundamental cultural problems in the way your proposition is laid out in your product. You can do a lot of research remotely. You can do a lot of Skype conversations for your interviews, for your contextual inquiries. You can do remote testing, you can do remote tree testing. But you absolutely have to go on the ground to discover context, to build rapport, and to really make sure that you minimize risk.
If your company's not ready to invest to minimize risk, I constantly say: okay, we ship, but there's a risk. I have no idea. For all my knowledge and all my smarts, there's no way that I can know how this product will behave in Brazil. I actually worked on a product where the PO had gone to check out the country, came back and said, yeah, we need to do this, we need to do that. We did all that. Six months later, we arrive in Brazil with a really fancy product. Very nice. It was a prototype, and it was pretty nice. The premise was that it constantly called for fresh data, to display the freshest data. What was the first thing we learned when we arrived? They don't have enough internet for what they have already. That was six months of work down the drain. Well done, dude. So go and research. You need to invest. You can go cheap, but if you go cheap now, you will pay later. So take it out of the marketing budget, out of the advertising budget, and put it into the product. Let's continue a bit, and ask me again if you want more. So: you want to make your participants tell you what's going on, okay? If you come out of here with one research technique, it's this one. You make participants tell you what's really, really going on. What you can use for that, one of my favorite techniques, is called clean language. Google it. There's a couple of books; Judy Rees. It's a way of asking questions that explores what the person means when they say what they say. Because we all use metaphors when we speak. So when I say things are fine, my fine is not your fine. Fine in India is not fine in Britain. People in Britain tell you they're fine when they've just come back from the funeral of their parent. That is not my idea of fine. So use clean language to explore what they really mean. Yeah, that was easy. What kind of easy? I'm like, you know, I'm expecting it to take time and be complicated.
So, you know, "it was as I expected, taking time and complicated, but it wasn't as bad as I thought it could be." Okay, that is not my definition of easy. Another technique, the classic: use open questions. And, what I should have added on the slide: listen to the answer. Again, it is incredibly difficult to do that when you research, because you're focusing on your questions, and you're observing things, and you're looking to see if your prototype is behaving, and is my camera recording, and how much time do I have, and oh my God, I need to pee. But I still need to listen. If anything, I need to listen. And remember these things, because we always forget to, but do explore: frequency, context, spatiality, triggers, blockers, collateral activities, outputs and outcomes. I know it's a big list; I could do you an entire workshop on this. But we often forget to ask: how often, or when, do you need to do that? Context: where are you? What are you doing? What kind of device are you on? What kind of network are you on? What else do you have? Spatiality: are you sitting down? Are you comfortably in a bus? Are you walking around? Are you in a safe environment or an unsafe environment? You know, if you're designing a map system, there are some places you don't wanna take out your fancy phone, because you're gonna get robbed. Triggers: what made you do this thing? And it's not just any "oh, I need to find my way". What made you whip out the phone right now? It's like, well, I kind of memorized the map and I no longer recognize where I am. And the blockers, the collateral activities, et cetera. So: all the fundamentals of research. Now let's go into a few techniques you can apply all this to. In particular, what I find very often, and that's because I'm a UX designer and information architect, is that navigations are often the worst part of many products. They are the ones that show me that the business doesn't understand the customers.
They don't understand what I need to use their product. So, things that tick me off: finding how to cancel the subscription. Sometimes it's dark design, deliberately hidden; and sometimes they just didn't think of putting it in an obvious place. Things that map organizational diagrams: this department takes care of this, so they have this side of the website, and that one has the other side. Yeah, but my activity spans the two, so where do I go? So: how to build a navigation. Agile can really help. You want to design and build iteratively, and you've got two different cases: new products from the ground up, or mature products you're trying to fix, improve, maintain, keep alive in some way. For new products, only show what has been built. Do not try crystal gazing into the future; only test what you have built so far, or what you're about to build. And iterate the entire nav as you build more. Do not be afraid of iterating the entire nav. And I'll tell you why you shouldn't be afraid: there is a world-leading company that has been doing this all along, and it's Amazon. This was Amazon's nav in 1998. Only two years later, that was their nav, and they kept iterating it. You can find this; Luke Wroblewski took all those screenshots. When they went live in '98, they had two things: books and music. So in their nav, happy campers, books and music. And as it went on, they added the stuff. Video and gifts came in, and then they had to move the search around. Then they started having e-cards and auctions, then toys and games, and more and more, and then the famous double nav that we all remember as one of the most horrific things ever designed. But they did not shy away from showing only what they had, instead of thinking, oh, in two years' time we'll be selling all this, so I need placeholders. They said: that's what we have. Navigation needs to take you someplace. Navigation is not a marketing tool.
It's not an advertising tool. It's a direction-finding tool. So don't mess around; don't tell me the toilets are this way when they're that way. And how do you test a navigation to make sure it works? Sorry, that was the tip for how to build a navigation for new products: feature only what has been built, iterate the entire nav. So, test as you go. Where would you go to find a book? Where would you go to find this? You're trying to buy so-and-so, how do you do it? If you have an established product, you need to rework from much more data, many more categories. Use card sorting. This is an example of card sorting. This participant was asked to card sort these hygiene products, and to give every category a label. So they put together, and that's their mental model, toothpaste, children's toothpaste, dental floss, and mouthwash, and said, oh, this is dental. And then they put together shampoo, hair conditioner, hair gel, body wash, and bar of soap, and said, that's for the shower. I find it fascinating, because I don't use hair gel in the shower, but obviously this person does. So this is great; we didn't know this. We need to know if more people do that, so that when we put our hair gel somewhere, we're placing it right. And then they put the lotions together: lip balm, body lotion, and hand cream. Once you know the categories you're working with, card sorting helps you a lot, and it can be done remotely very easily. What's interesting is that a lot of card sorting is done remote, unmoderated: you just send a test and people take it when they can. What's great is doing it moderated, so you can ask people: oh, that's really interesting, tell me more, why is the hair gel in the shower? Am I discovering something? Is this the tip of the iceberg? You need the conversation, which unmoderated doesn't give you.
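To give a feel for what you do with card-sort results once the sessions are done, here is a minimal sketch (not from the talk) of a standard analysis: counting, for every pair of cards, how many participants grouped them together. High-count pairs suggest a shared mental model, like the hair-gel-in-the-shower grouping above. The card names and participant data are hypothetical examples.

```python
# Hedged sketch: pair co-occurrence analysis of card-sort results.
# Card names and participant groupings below are made-up examples.
from itertools import combinations
from collections import Counter

# Each participant's sort: a list of groups, each group a set of cards.
sorts = [
    [{"toothpaste", "dental floss", "mouthwash"},
     {"shampoo", "hair gel", "body wash"}],
    [{"toothpaste", "mouthwash"},
     {"shampoo", "hair gel"},
     {"dental floss", "body wash"}],
]

def co_occurrence(sorts):
    """Count, for every pair of cards, how many participants grouped them together."""
    counts = Counter()
    for groups in sorts:
        for group in groups:
            # sorted() gives each pair a canonical order so counts aggregate correctly
            for pair in combinations(sorted(group), 2):
                counts[pair] += 1
    return counts

pairs = co_occurrence(sorts)
# Pairs grouped by the most participants are candidate categories for the nav.
for pair, n in pairs.most_common(3):
    print(pair, n)
```

With more participants, the same matrix feeds clustering or a dendrogram, but even the raw counts make the conversation with the team concrete: both of these participants put hair gel with shampoo, so maybe it really does belong in "shower".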
The other way you can test for your established product is tree testing. That one can be done quite simply. With tree testing, you validate that people can move through the categories. There are a lot of automated tools; I think this one is Treejack. Task one of ten: you want to start your own business, you're not sure what paperwork you need to fill out, how would you find out? You've got the banana.com homepage: individuals and families, business and employers, nonprofit organizations. We really want people to click here, but let's find out where they actually click. They might say, well, I am still an individual, I'm not a business yet, so maybe I start in individuals, because today I'm not yet a business. So maybe, when we start exploring a bit more of what goes on, it needs a fourth category that says: aha, I want to start a business. Card sorting, tree testing, growing your nav. How's that going? Is that of any use to you so far? Are you seeing your problems, your concerns? Useful? Okay. Not useful at all, I need something else? I'm here for you. The next thing you can do is always test with realistic content. Two reasons for this. One: when you use unrealistic content, you get bitten as soon as the real content comes in. Here's an example, I hope you can see it well enough. I don't have an image, I have lorem ipsum. And then when we get the real content: well, I can't really see my cake. Why is there so much black and white? The label doesn't fit. I've got way more copy, I've got more to say, and I don't have any room. So don't learn later that your design doesn't work. Also, don't learn later that the process by which you get content doesn't work, because that's often the bigger problem. Copywriting sometimes needs to go through other channels: through marketing, through legal. These guys have no idea what agility is. Three weeks, for them, is short.
So if your copy pipeline is a problem, you want to learn this as quickly as possible. You don't want to delay your go-lives because the copy's not there. So get the most realistic, or the best, copy you can, as soon as possible, to make sure everything works together. We're humans; we need context. And do test before you have all the pages. One way of doing that: you often move through pages using calls to action, and you can stop at the call to action. Very often we think, no, I need to test all the way, so I need to have the page the person goes to when they've clicked the button. No, no, no. Go just to the button. You have a form; let's say I need a new card, I'd like to get a loan. Make them fill in the form and just click on "get your quote". Because what we want to check at this point is: do they have the information we want at the tips of their fingers? Does the way we ask the questions make sense to them? That's all that needs to work here. How the information comes back, how they understand our quote or our offer, is a completely different thing. So I can stop at that "get your quote" button and get the user to press it, whether it's a paper prototype, a digital prototype, or even the build, if I test in a test environment. And nothing happens. They say, oh, the button's broken. Oh, yes it is. What did you expect to see? And then they tell you what they expect: well, I really want a big number, I want to know if I'm eligible, I want to know this, I want to know that. And you learn all of it. So testing before you have all the pages is a good secret. To really get to Agile user research mastery, your goal is to have only stories that are ready for imminent development. The time focus is squeezed in, contrary to the big tradition of design and research, which is: we have time, we explore, we make sure we have the right thing before we go to build.
Shift your mindset as a designer and as a product owner to: just find me what we need right now, and I will take more time later if we need more, when we need it. Which means there needs to be an agreement that there can be design debt. I've introduced the concept of design and research debt with my teams, to say: just so you know, we're pushing this page now, but we only know about half of it. So guys, expect to see it again in tickets later on, in another form. I'm not gonna make you build the same one over again, but we will iterate on it. And I can't hear any grumbling that we've already worked on it. We're Agile; we iterate as we learn. Sometimes I need to remind the developers of this: "I've already built this page five times. Why don't you get it right the first time?" Because getting it right the first time is impossible. Getting it right the first time would have been so expensive. We're learning how to do this. And because you've built it five times, hopefully you've also cleaned up a lot of stuff on it. Might as well. And so, to get your stories, or your insights, or your designs (I'm using "stories" in the broadest sense) ready for imminent development, here's what you do. If you work in Scrum teams, for example a Scrum team using a two-week sprint, I can do this: three days design, three days research, three days design. We all know the tenth day is meetings; well, I'm just tidying up. Prepare a bit, research, do my analysis, do my thinking, do my sharing as I do the research. So at the end of the day, I call the team and share; in the morning stand-up, I show what's going on. And then three days integrating the tweaks, building the elements I need to give to the devs so they can do their part. I need a bit of time on both sides. When I say three days research, it's mostly running the research, analyzing, and sharing.
The prepping part, where I know where to go, where I may need to book train tickets or book people, that's the hard part, because you may need to plan ahead. So what happens in Agile user research, and I don't have a slide on this, is that you may need to book people without knowing what you will show them. Just know that on the 17th you'll be in Mumbai and you've got 20 relevant users booked, and by the 17th you will have something to show them. Because you're Agile, you will have something; I just don't know what. I've very often booked research sessions like that. I know I have the right users in a room for 25 minutes; that's all I need. I will always have something to ask them. There's always something to learn. I will always have something to show them. But I will probably only know what I need to show or test just before that date, because we are in "just what I need right now" mode. So do very short loops. Sometimes I've even done several micro design-research cycles. I got very lucky developing an internal product for a company: my users were literally in the building next door, as well as in other cities and countries. I knew I could jump on the train and in an hour I would meet users. And I also knew I had permission to go and interrupt them, within some very important criteria. So when I found myself saying, actually, I don't know if they really need this, I could absolutely cross the road and go talk to the people: Jeff, five minutes, can I show you this? Can you tell me about that? Have five conversations in two hours, come back with the insight, and amend my design. This is super powerful. It demands that I'm reactive. It demands that I have buy-in from the business. But do you realize the value of this approach, rather than saying, oh, my next research session is in three weeks, I'm gonna add the question to the long list of questions that I then don't know how to make sense of? Anybody using Kanban?
So if you use Kanban, you need conversation. You need to negotiate with your developers on velocity, because I know how long it's gonna take me to find something, and if you can't wait, keep yourselves busy, do other things, while I do my stuff. Every now and again, even in Scrum, we've hit big design snags where I'm like, guys, I'm so sorry, I'm gonna need some time to figure out how to do this one. I know we're planning on doing this functionality in the sprint; we're gonna have to pull something else up, because I don't want you to waste your time. So in Kanban, you very much need to negotiate your velocity and theirs to get there. How are we doing? Two more minutes? Yeah, we're fine. Oh, we're over time, sorry. So that was it, actually. I'm gonna stick around. The session is officially over, isn't it? We have 15 minutes for room change, but I'm gonna take your questions now. Apologies for going over time, but I'm a bit excited here. Thank you very much, and please come up with your questions, or if we just wanna use the time, just come and stay. Thank you. Yeah, fire away. This one: how do I size my design stories? I've got over ten years' experience designing, so I know how long a piece of string is. And I also know, from experience, what I can get away with not doing. So I negotiate what I'm gonna build, and I also know from my designers, sorry, from my developers, what's a big technical effort and what's a small technical effort. So we have lots of discussions on: is this something that needs a lot of design but very little dev? I'll tell you, the typical thing that is a lot of design and very little dev is forms, because we need to get the content right, and the way we ask the questions right, so we get good answers and can execute things. But the build is super easy.
Something that is super easy design and super complex build: single sign-on. The design is easy; the build, you know the pain. And then we have things that are of about equal weight. So I always need to negotiate; depending on the technology and the maturity of the team, we negotiate these things. There's a lot of discussing how much work is needed to get somewhere. And we do that as part of sprint planning; I very often work in Scrum.