Okay, all right. So how has it been till now? Good, right? How many of you were here yesterday as well? Ah, that's nice. So what I'm going to do in the next 20 minutes is present an experience report about improving your customer feedback loops. Customer feedback loops are something we all want when we are doing agile: we want to get feedback as early as possible so that we can improve our systems and deliver better. Before I go into all of that, I want to ask: how many of you believe, and when I say believe, I mean firmly believe, that agile will never lead to failed projects? Considering that no one raised their hands, I can safely assume that all of you have failed at least once, right? Before we go ahead, can any one of you share one of your stories in brief: what was the failure, and why did it happen? Can any of you relate to that? Any short story. Just to make it interesting, I want you guys to speak. The one who speaks gets a goodie bag. And there's a goodie in the goodie bag, right? So we'll keep it interactive and I'll make you guys speak in this session. So, anyone who can tell me if one of your agile projects has failed for any reason? Good, sir. Your name, please? Gaurav, yeah. Okay. You were not aware of the scope of the story, and you still took it into your sprint. Some Scrum people will really kill you for that. But yeah, that's a valid scenario. And thank you for speaking. I actually have something for you. Please, sir, if you can come here. Quick run, run, run. Come on. Thank you. So you guys now understand the drill: you have to be vocal in this session. There are not a lot of people, so your chances of winning are higher. Everyone gets a different thing, so you have to try your luck. Okay, so here I am. I'm going to talk about one of my projects that failed a few years back and the reason why it failed. So let's go through a flashback. I was working on something that I would call the Manga project.
I won't say the name of the project, even though it's quite public. How do we categorize a project failure? It can be anything: something that we promised and couldn't achieve, something we did expecting a huge return on investment that never materialized, or maybe we ran out of budget and couldn't complete the project. Any of these reasons can mean it's a failed project. And obviously, in most big organizations, we hate to say that a project failed. We hate to accept mistakes. But every now and then, we do come to know, or we do realize, that something has gone wrong and we have to fix it. So one of my projects failed, and we actually did a very thorough root cause analysis to understand why. And this was the reason. Our definition of failure was that we did not do what the customers wanted the software to do. Does that sound familiar? The software is not doing what the customer expects it to do. Why is that? How many of you attended Richard's keynote yesterday? Oh, good. He made a statement: we build for stupid users. How many of you believe in that statement? If even one person believes in it, that's good. Your users are not in the room, so you can raise your hands if you believe in that. But here's the deal: users are not necessarily software people. Users are not from the software industry. They are just regular people expecting that you will create a software system that helps them do their regular job. That's it. So consider: if you're a data entry operator and you're sitting in a room full of IT people, and an IT person is demonstrating whatever they have developed for you, as a data entry operator, will you really care? Most of us won't. Again, to quote from one of the presentations we had yesterday, I think Sanjeev spoke about it: managers are people who should basically get pizzas for you.
But usually, if you're sitting in a meeting room and your manager is present, even if you have a fantastic idea or want to provide feedback, you may not do it. Why? Because your manager is there, and they are the decision makers for all of your software decisions, your business decisions, everything. And as software programmers, or even as data entry operators, we don't care about managerial decisions at all. I really don't care what strategies you are creating; all I care about is that my programs are being built and deployed. How many of you agree with that statement? How many programmers in this room? How many of you actually talk to your managers and ask them about the organization's vision? I was expecting at least one hand. But that's the truth. At the ground level, we don't believe in that. And that's what happens in many cases, and with many people who are not even from a software background: mostly, our customers. So the root cause analysis gave us three astonishing observations. One of them was this: humans always complain first. They appreciate later. Sometimes never. What I want you to do is think about the last feedback that you gave to someone. Just think about it. Now raise your hands if that was positive feedback. Two hands, three hands, four hands. That's less than 10% of the room. Isn't that true? If a software system works the way it's supposed to work, that doesn't earn positive feedback. That is what was expected; the customer was paying you for doing that. Isn't that true? You did something that you were paid for, so I don't care to appreciate you. Why would I? So that's the reality. And if you are a customer attending a demonstration, say of a product that is supposed to be released in the future, during the demonstration you will not care. Because most of us only demonstrate the things that work.
We don't demonstrate the things that are not working. How many of you demonstrate the things that are not working? None of us. Good. So yes, but when this goes into production and something fails, your users are not going to shut up. They're going to kill you. They're possibly going to sue you for all the money they spent. Right? That's the first observation. Second, we discussed this a lot in today's session as well, and also yesterday. I will quote Craig on this. How many of you attended Craig's session yesterday evening? He said: make room for introverts. Okay. Introvert users need time to make their decisions. They need time to think and process before they can actually come up with something. If you are expecting your users to be up on their feet all the time, providing you feedback at every demonstration, that's not going to happen. But these are the same users who are going to use your software systems when they are released into production. And if something doesn't work, they're going to hit production defects. Has anyone here released a project into production without any production defects? I was again expecting at least one hand. But yeah, that rarely ever happens. Maybe it was not even a production defect; maybe the system was simply not intuitive enough and the user did not understand how to use it. That's possible. "I don't understand how to do this. This is my regular job." They won't care to learn the system. They will raise a problem first. That's an introvert user; not to blame them, but yeah, it happens. Third observation, the most important: if you don't receive feedback, does that mean everything is good? Does that necessarily mean everything is good? Obviously, when I asked you about the last feedback you gave and whether it was good or bad, most of you gave bad feedback. But if no one gives any feedback, does that mean everything is working properly? It does not.
Sorry? You tend to assume that. Yes, we tend to assume that because no one said anything. Like after this session, if you guys don't come up to me and say, "Vishal, it was a good session," I will assume that it was a good session. Right? They're effectively not using it. That's true. Yes. I mean, see, okay, we are discussing a lot about apps and installing or uninstalling them. I work with enterprise applications. How many of you work on enterprise applications? Anyone? Okay, so has it ever happened that you tried to solve a problem, something that people have been facing for quite a long time, and you provided them the software, and after a few months you go and speak with them and they say, "Yeah, it's not working, and we are actually doing it the old way; we are not shifting to your system"? Has that ever happened? Yeah, it happens. So sometimes you don't even have an option to uninstall the app. You're just stuck with it. "It's not working. We'll work out something else to do it, because we still have to earn money for our company. We don't care." So it happens. But if you don't receive any feedback, isn't it your responsibility to go and fetch it? Do we do it? It's our responsibility, but we don't do it. None of us are Spider-Man over here. That's good. So let's see how we can fix it. Now, there are three fixes that I have tried in the past. They have worked for my teams, a few of the teams, not all of them. I'm not sure if they will work for you, but if you go and experiment and they work for you, please reach out and tell me how it worked out. The very first thing that helps a few teams is when I say this statement: the demonstration is for the software team. What does this mean? What happens if something breaks in production? Or rather, let's take a step back. A programmer delivers something on integration or on acceptance, and a QA finds a bug. Who's the most irritated at this point? The programmer. Now let's go a step further.
It goes into production and something breaks. Who's the most frustrated at this point? It's the QA. The user is frustrated, yes, of course, but the QA is frustrated even more. A few days back, I played a game with one of my teams where they were supposed to do a random activity, and every time they failed, we restarted. There was one QA on my team who kept making the same mistake again and again, and after the third time he got frustrated and left the room, and we actually had to get him back inside. Because that's the truth: when people fail, the people who are using the software will obviously shout, but the people who built it are sad. We don't want sad people in the company. So when I say that you should prepare for the demonstration, it simply means that you, as a team, when you're demonstrating to your customers, prepare your questions first. You need to fetch feedback from your customers. So write down the questions that you want to ask your customers, specifically and especially for the people who do not talk in meetings; you need to actually ask them: "Hey, do you think this piece of software will solve your business problems?" If they're a data entry operator: "Do you think you will be able to do 1,000 data entry operations in one day once you have the software?" These are the questions you can use to go and fetch feedback. Yes, sir. Sir, I'll interrupt you over there. Your name, please? Bianca. Bianca, I'm going to give you one goodie bag, because that's how it works. Yes, please. The reason why I'm appreciating your question is because that's the next slide that I have. Okay? How many of you get real users for your demonstrations? Real users, people who are going to be hands-on on the system, before and after. Yeah, yeah. Most of you don't. What you get in the room during a demonstration is user representatives. Usually they're managers, okay?
Managers are sitting in the room trying to understand what your software is going to do, and the manager is wondering at the back of their head whether the people can actually use this, instead of going back to those people and asking them to actually use it. That's where we fail a number of times. And that's the next slide that I have: usability and user testing. How many of you do usability testing? And not A/B testing; I'm talking about usability testing with actual users. Two? That's good. There's a nice book by Steve Krug; I mention it in many of my talks. It's called Rocket Surgery Made Easy. It's a very thin book and a nice read; you should actually go and read it. It describes a very simple way of doing usability testing that helps any software team improve its systems. Here's how usability testing works: you get a few users into your office and present them with the actual system. These are real end users. You give them a few tasks, and you just sit there silently with them in one room. I'm going back to the point where we said that people who are introverts need their space. You silently sit in that room, just give them the tasks, and ask them to do the tasks on their own. If these users are able to perform all of those tasks, which are their tasks, the things they are going to do when the system goes live, that's good feedback for you. It means there's a good chance that when the system is rolled out, it will be intuitive enough to be used by everyone. At the same time, if you club this with user testing, so the same user is doing the same tasks, and something fails, or the user says, "What happened right now is not what I expected," that's feedback for you to change. So do club the usability and user testing aspects together. Yes, sir. I'm not giving out more goodie bags for that. We'll discuss the UAT phase later, if you don't mind.
We can actually have a one-to-one discussion. I have different thoughts in mind when it comes to UAT, so we won't get into that. Yeah, that's fine. You don't need to. Okay, guys, we'll discuss this. That's the reason why I specifically mentioned Steve Krug's book, Rocket Surgery Made Easy. It's not about testing your software during UAT; it's about testing your software every one to two iterations, or one to two sprints, as we call them. So yeah, testing with real users gives you real feedback. That is what is important. And then, finally, there is something we usually do at the end, at the end of every demonstration, that is: anonymous feedback, okay? And when I say anonymous feedback, it's not like handing a survey to your attendees, asking them to fill it in and give it back to you whenever. No, you don't let them leave the room before they've completed it. You give them a form with a mix of open- and closed-ended questions, specifically about the features that have been demonstrated, and let them reply. You don't let them leave the room unless they have done it. That implicitly gets everything out of them. So these are three practices that I have tried on multiple projects. A few of them have helped me; a few, in many cases, have not. That's just the experimentation aspect, okay? That's all I had for my short experience report. Thank you. You can get in touch with me at vishalprasad.in. And I still have two more goodie bags to give to people who ask me questions. Yes, sir. Did everyone hear his question? Anyone who did not? Okay, the question was that since we now work in virtual teams, and the users are remote compared to where the software is being developed, it becomes really difficult to get the key users in the room during a demonstration, or to actually work things out with them. Yes, it is difficult; I'm not going to lie about that. But again, there are methods you can use.
You can do things like remote usability testing. You can have people who are located where the customers are when you do these kinds of tests. There is an amount of investment, time and money, required to do all of this, but it's not impossible. It all depends on whether your situation allows it. The same goes for all three of the things I talked about: preparing upfront, doing usability testing, and getting user feedback through surveys. Even the feedback that you get from surveys: if your users are sitting remotely, you want to do an online survey, and you don't want them to "leave the room," what do you do? There are challenges with everything, and you will have to experiment a little to work around these challenges. But yeah, if you do it, it can help you. Every project has its own challenges, so you'll have to work around them. Okay? Yes, sorry. Yeah, sure. So if you... Well, first of all, it should not be long. I would not go beyond, say, one A4-size sheet; that should be good enough to capture a lot. What you need to keep in mind is that the open- and closed-ended questions should be about the features that have been demonstrated, not about the people who developed them. Sometimes we make mistakes here, because sometimes during a demonstration something is not what was expected, right? And we know who demonstrated it, or who built the system. But you need to make sure the questions are framed such that the feedback is about the feature, not about the people who made it. That's something you really need to take care of. If you want an example: say you have a login screen, and nowadays on many websites the login screen opens in a modal window. Let's take that simple example.
If there are people who are not comfortable with that modal window, you can have a question like: "Would you consider a different design for the login screen compared to what was shown during the demonstration?" And that can be a simple yes or no. That's okay. But it at least tells you whether you should go and revisit your UI designs so that users are more comfortable using the system. So design the questions around the system, and not around the team; that is good enough in that respect. And make sure the questions are about what has been demonstrated, not something that was done before. It's during that one-hour period that you have been demonstrating, so it's fresh in their minds and people can give you feedback. Hope that answers it. For that, you get a goodie bag. Sorry, I'll take it. In line with the vision of the project. That's true, and that's a very good point. How many of us do a visioning exercise before starting a project? Some sort of, say, elevator pitch which captures the vision of the project? If you don't, you should start doing it. Then, every time something gets delivered, you can go back and check whether this piece of software aligns with the vision of the project, and if it doesn't, then something is wrong. That's a very good point, sir. Thank you very much. Yes, sir? Yeah, absolutely. Absolutely. You need not incorporate every piece of feedback you get. You need to make sure that it adds some value to your product. If it doesn't add any value, that feedback is not necessarily something you should act on. Even with usability testing: if you do usability testing with three users and all three report the same problem, then that is something you should address. But if, out of ten people, only one person reports the problem, that's okay, you can live with it. You don't need to go and make everyone happy. Yeah, that's a good question.
Yes, the psychological thought process. Wow. I really can't speak to the psychological thought process. I can give you a goodie bag. I'm really sorry, I cannot talk about the psychological aspect. These are just experiments that I have tried, and they have worked. I have no idea why they worked, but I'm more of a data-oriented person: if I see that the data says it's working, I go with that. The psychological aspect may be that if you actually ask users for their opinion, and I'm saying maybe, okay, I'm not a psychologist, if you ask people for their opinion, they are usually happy to give it. We really don't hold back opinions; we love to give them all the time. So, the world is not perfect. You cannot reach everyone. If you keep asking someone for feedback but they are just not giving it, you can't do anything about it. That's one reason why projects fail. Yes. Okay, we just have two minutes; I'll take two questions. That's what the gentleman at the back said: usability testing is not something for which you have to wait until the end. It can be done as soon as you have a design on paper. Obviously, I clubbed it with user testing, and for user testing of the system you'll have to wait at least for a prototype, but usability testing is something you can do way before that. You don't have to wait until late in the project. One minute, one last question. I'll give a political answer to that: Agile cannot fix everything. And that's the proof. A lot of the time, if something goes wrong in the organization, people blame Agile, and when things go right, people credit Agile, which is not the case all the time. There are certain things that you will have to fix before Agile can do anything good for your organization. I would consider Agile to be like infinity: once you have achieved at least the basic practices, anything that you add on top that helps you is just going to make you more Agile.
But before you achieve that, if there are things to be fixed, don't expect Agile to fix them. Organizations can't adopt Scrum overnight; it's not possible, or any other Agile practice, for that matter. Things take time. Agile cannot fix everything. That's all the time I had. Thank you for attending this session. Thank you. And we have some more goodie bags for you at the Crest booth, so please do visit it. Yes, after lunch. Enjoy your lunch. Thank you.