Glad to see everybody here. We're going to use Miro today. I'll talk about that a little bit later for those that are unfamiliar with it. And today we're going to talk about maximizing your product success through root cause analysis. We're going to see some real live examples and a framework that we use internally at Glassbox to uncover, and help our clients uncover, the root cause of the problems preventing users from converting and affecting their experience throughout their website or product. So welcome, everybody. My name is Isaac Mardin. I'm a senior product manager at Glassbox. With me here is James Harbour. Say hello, James. Hey, nice to meet you. He's a co-product manager working with me. We're happy to have everyone here. So first, a few words about Glassbox. Glassbox is a one-stop-shop solution, and we make your customers' digital experience better. How do we do this? We help you capture your customers' entire digital journey on web and mobile. We help you visualize it in an easy, intuitive, friendly UI in our platform. And we help you analyze it using AI to identify customer struggles, and we recommend appropriate actions. This helps you elevate your business, increase engagement, increase revenue, increase conversions, and make your customers happy. We can do this because we have a very vast and robust suite of products, everything from session replay to product analytics, heat map tools, and journey mapping. Today, we're going to use Miro. For those that aren't familiar with it, it will help us collaborate, so we can ask questions and you can answer them and vote. If you have any questions along the way, feel free to simply unmute yourself and ask, though we will also have a Q&A at the end. The link for Miro should be in the chat right now for everyone to join. I'm just gonna wait here until I see some people joining. There are two important tools for you to use.
First of all, the hand tool. This will help you move around and follow the discussion with us. We also have a feature that enables me to bring everyone to where I am right now. And there's the arrow tool that will help you drag objects. You'll need that to move dots and post-it notes around. On the left side of Miro, you'll see the arrow, and you can click on it to switch between the hand and the arrow, or you can simply press V on your keyboard and it will switch. I also use a regular old-fashioned mouse and not a trackpad, so right click will also help you move around, and left click is the one I use to drag. Feel free to use those if that's more comfortable for you. I'm just gonna wait until I see that we have roughly the same numbers on Miro as we do on Zoom. You have the link in the chat for anyone that's just joined recently. And as mentioned before, we do encourage you to unmute yourself and ask any questions you have along the way. Let me bring everyone to me. I like this Miro feature, it just brings everyone to me. All right, so I think we've stabilized, so let's get started. First, an easy one: let's get to know each other a little bit. I want you all to come here. I'm gonna bring you all to me as well. Let us know what your role is. Simply drag a dot to whichever role best fits your current position. You can also double click on any of the post-it notes and add your own if it's not listed here. We wanna get to know you. A lot of PMs already, as expected. Ooh, let's bring all the cursors around. That's awesome. Yeah, so we can see a lot of PMs. It's a session focused on product managers, product owners, UX designers, even product marketing, so that's expected. But we're happy to see a business analyst as well. Did you see it? Someone's painting. Is that, yeah, someone's drawing on Zoom? Is there a way to clear those annotations?
Yeah, clear everything. There it is. Oh, I think it's actually one of us that did that. So whoever is drawing there, just go to the Zoom annotation controls, click on the mouse icon, and then you'll be able to drag and drop the dots wherever you need. Yeah, someone is still in annotation mode. All right, I think we got most of the roles here, so it's very nice to meet everybody today. Let's talk a little bit about your challenges. What are your main challenges in your current role? As before, feel free to take a dot to as many as are relevant to you, and you can add your own if it's not listed here. Yeah, low conversion is always a high one. Low conversion, quality and accuracy, product adoption, yeah. It's a good spread. Driving product adoption is also an important one. You release a feature, you want to make sure it's being used. We're going to cover low conversion and product adoption today, so I'm glad this is something that resonates with a lot of people here. There's a lot on driving adoption. So again, whoever's in annotation mode, simply go to the Zoom controls and click back on the mouse icon in the annotation controls, and you'll be able to vote and not paint on the board. But it's fine if you connect to your artistic side. Yeah, so we see getting everyone on the same page, creating a long-term strategy, driving adoption, gathering and analyzing user feedback effectively. I can definitely resonate with all of those. Given that we have a lot of product managers here today, that's expected. Collaborating with cross-functional teams. Getting everyone on board, that's important. So for those that just joined us, I see we have more participants. We have the Miro link in the Zoom chat, so feel free to click on that and join us.
If you haven't joined until now, the link is there. I see someone organizing the dots. I like that. Yeah, it's nice. OK, so that's a solid list of challenges that are very familiar to us. Let's talk a little bit about our agenda today. Today we're going to talk about uncovering the power of root cause analysis. We're going to look at some strategies, tools, and methods that we can use for that. We're going to look at two case studies, real case studies that we encountered: one from a client of ours, and one internal that we actually solved ourselves. Then we're going to have a few minutes for Q&A. And at the end, you'll have an option to give feedback on this session. I hope you all enjoy today. To continue with our agenda questions, I'd be happy to understand which analytics tools or techniques you currently use to gain insights into user behavior and product performance. Again, you can see them below. This will help us make sure that everyone is familiar with the various tools we can use to uncover the root cause of certain UX problems. There are heat maps, session recordings, surveys, user journey mapping; just dot the ones that you currently use, so we know what everyone is familiar with and which are the most popular. And if we see there's something that nobody uses, we can maybe introduce it and elaborate on it. You also have empty white sticky notes, so you can just add another one if it's not listed here. And again, as mentioned before, you can choose multiple: if you use all of these, choose as many as you like. Yeah, exactly, thank you. A huge amount of dots on surveys and VOC, more than I expected. A lot more than I expected. I'm happy to see a lot of people getting real user feedback with VOC, with surveys, with user interviews. That's much more than I anticipated. This is rare, so it's nice to see it.
It's nice as well, because later in the case studies we're looking at some of the other methods, so we'll start to look at those too, which is great. Again, I see someone just joined. We're using Miro for this session today; for those that just joined, there's a link in the Zoom chat, so feel free to click on that and join us. But I think we got a nice spread. I see two that haven't been clicked, three actually. So heat maps let you visualize the product page or website page and see where users are interacting most, where their focus is, where they're clicking the most, and funnels help you understand where the drop-off occurs. Ad hoc charts and queries let you query the data and get, for example, conversion rate over time or counts, or analyze segments more thoroughly. Those are very common product analytics capabilities. Glassbox, fun fact, has practically everything on this list. All right, I see no more voting, so let's get on with the workshop. Today, we're going to talk about uncovering the power of root cause analysis. First, let's see why root cause analysis matters. With root cause analysis, we can get to the heart of the problem and see what is really causing the bad user experience. This helps us empathize with our users, because we understand where their pain is coming from. A lot of times when we see a drop-off at a certain point, it's not really because it happened there; it's not really because they didn't click the button. There's an underlying cause. So understanding where the pain point comes from can help us empathize with users and solve the real problem at hand. It also helps us treat the issue and not the symptoms. This is something product managers handle a lot: causation versus correlation.
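To make the "charts and queries" idea concrete, here is a minimal sketch of computing conversion rate over time from a raw event log. The event names, dates, and users are entirely made up for illustration; a real analytics tool would run an equivalent query over its own event schema.

```python
from collections import defaultdict

# Hypothetical event log: (date, user_id, event_name) tuples, the sort of
# export a product analytics tool might produce. All values are illustrative.
events = [
    ("2024-05-01", "u1", "view_signup"), ("2024-05-01", "u1", "complete_signup"),
    ("2024-05-01", "u2", "view_signup"),
    ("2024-05-02", "u3", "view_signup"), ("2024-05-02", "u3", "complete_signup"),
    ("2024-05-02", "u4", "view_signup"), ("2024-05-02", "u5", "view_signup"),
]

def conversion_by_day(events, start="view_signup", goal="complete_signup"):
    """Daily conversion rate: users who reached `goal` over users who did `start`."""
    starts, goals = defaultdict(set), defaultdict(set)
    for date, user, name in events:
        if name == start:
            starts[date].add(user)
        elif name == goal:
            goals[date].add(user)
    # Only count goal users who also started that day, to keep the ratio honest.
    return {d: len(goals[d] & starts[d]) / len(starts[d]) for d in sorted(starts)}

print(conversion_by_day(events))  # {'2024-05-01': 0.5, '2024-05-02': 0.333...}
```

A dip in this series on a particular day is exactly the kind of symptom that kicks off the root cause analysis described next.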
So you can treat the symptoms a lot, but that still might not solve the issue you're actually trying to solve; it can be underlying, and it needs a little bit of investigation to get to. Ultimately, root cause analysis helps us get real product success, increase customer satisfaction, and make everybody happy. But there are some challenges in root cause analysis. It's not that easy, and the data is not always on the surface and easy to access. There are three big challenges that we see in root cause analysis. First is limited data and lack of access to comprehensive user data. Some tools don't have all the data you need, or sometimes there's not enough tagging in the product itself to get all the information and KPIs you need to understand it and create funnels. Second, some issues are very complex and have multiple contributing factors, so pinpointing exactly what the main root cause is can also be a problem without the right tools and resources. And this leads me to the last one: a lot of times it takes time, budget, teams, and people to actually investigate. As product managers, we're very busy as it is, and we have a lot of other responsibilities and priorities. So resources are another big, big challenge. I'd like to hear from everyone whether this resonates with you: are these the challenges that you encounter in your day-to-day of identifying the root cause? And if not, what other challenges do you see in uncovering the root cause in your day-to-day work? Again, you can select as many as you think are relevant to you. I saw someone else just joined the Zoom meeting, so feel free to join us in Miro with the link in the chat. Yeah, limited data, immediately it's a big one. Not having access to the right tools, not being able to understand user journeys, that's very important. Limited user sample size, that's another one, especially in B2B companies and smaller startups: you don't have as much data as the large B2C companies that get a million hits a week, so it's hard to pinpoint where the problem actually comes from. I'm seeing a trend, though, that not a lot of people are adding custom sticky notes, so at least I feel good that we managed to pinpoint the common challenges across these questions. Yeah, I think there was certainly one added to each of these. There's one here: understanding the problem takes time. Limited data, sometimes you find that has a correlation with access to the right tools. You don't have access to the right tools, but if you did, you would have the data; it's not necessarily data missing from the tools you're using, it's not having access to the tool in the first place. I'm glad to also see complex issues with multiple contributing factors. I'm going to address that in the best practices section; we have a slide with an important best practice for that, so I'm glad to see that as well. Oh, I see one circle is hidden. If you change something and move it around, you can just press Ctrl+Z and it will change back, don't worry about it. Awesome. So I think we got most of the votes in already. Today we want to share our framework for identifying the root cause, and for that, we boiled it down to five steps. First of all, you have to identify the problem and see what you're trying to solve. Clearly identify and articulate the problem or issue you're trying to investigate. If it's a conversion issue, where exactly do you see the drop-off happening? If it's an adoption issue, then again, you have to analyze the data and see whether users are interacting with the feature at all, and whether they're simply not moving forward with it, there's a lack of awareness, or there are technical problems, and try to pinpoint that.
Once you know where it's coming from and what the symptoms are, then you know what data you need, so you can move to step number two: gather and analyze data. In this step, the point is not to deep dive into any specific cause or spend a lot of time; don't boil the ocean, so to speak. The point is to gather all the relevant data you can and analyze it at a high level, to start identifying patterns, trends, and potential correlations that can point to the root cause, again without deep diving into any of them. This step takes you to step number three, where your goal is to hypothesize potential causes. Just dump them into a list of potential hypotheses or theories about the root cause, based on the data analysis, your understanding of the product and its context, and what you've gathered so far. That list can be long, anything from five or eight causes to even twenty, depending on the case. So it's important to prioritize that list in step four, based on impact and feasibility and your knowledge of the product. Prioritize the causes based on likelihood and potential impact, and focus on the top ones that you think are affecting the problem you identified in step number one. And then you go to step number five, where you investigate and dive deeper into those top causes to gain deeper insights, using tools like session recordings, user research, heat maps, and more. This way we are very efficient, because we don't dive into the whole list of everything that's happening; that would be shooting arrows blindly in every direction. We wanna be focused.
So: identify the problem; gather some data so you understand the potential causes, but without deep diving; build a list of hypotheses so you can prioritize them based on impact and likelihood; and finally deep dive into the top ones. And you can continue the cycle if you don't find anything, or if there's another problem to solve, taking into account one of the challenges we mentioned: there can be many complex issues. Now, at Glassbox we're able to do this because we have a really vast suite of solutions, everything from session replay to heat maps to product analytics, so everything talks to and connects with everything else, and you can move from session replays to product analytics to heat maps and see it all connected together. This really, really helps us in our analysis. Now here are some questions to look for, so you can always reference this back if you don't know what to look for. How are users interacting with my page? Are there any technical or performance issues? By the way, these are not in any particular order; you can just choose the ones you think are most probable. Do users focus on what you want them to focus on, or do they go elsewhere? And that leads me to: where are they going from this page? Are they going where I want them to go, or are they coming back? Do we see a loop in navigation? This happens a lot with carts and plan pages. Are there any usability issues preventing them from using the site properly? Is the navigation intuitive and consistent across your product or website? Is the page mobile-friendly? Is it responsive? Can it support the smaller screens used around the world? Product teams and UX teams usually design on very big, new screens on PCs or Macs, but a lot of users actually have small laptops, or they're using the product on the go on mobile. That can really hurt if the page is not responsive.
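Step four, prioritizing the hypothesis list, can be sketched as a simple scoring exercise. The causes and their 1-to-5 scores below are illustrative judgment calls, not measured data; the point is only that an explicit likelihood, impact, and effort score lets you rank a long list quickly instead of investigating everything.

```python
# Minimal sketch of step 4: rank hypothesized causes before deep diving.
# All causes and scores here are hypothetical examples.
hypotheses = [
    {"cause": "Form field validation bug", "likelihood": 4, "impact": 5, "effort": 2},
    {"cause": "Page not mobile-friendly",  "likelihood": 3, "impact": 4, "effort": 3},
    {"cause": "Unclear value proposition", "likelihood": 2, "impact": 4, "effort": 4},
    {"cause": "Slow page load",            "likelihood": 2, "impact": 3, "effort": 2},
]

def priority(h):
    # Higher likelihood and impact raise priority; higher investigation effort lowers it.
    return h["likelihood"] * h["impact"] / h["effort"]

# Step 5 then deep dives only into the top of this ranking.
for h in sorted(hypotheses, key=priority, reverse=True)[:3]:
    print(f"{priority(h):5.1f}  {h['cause']}")
```

The exact scoring formula matters less than doing the ranking at all: it forces the team to agree on where the deep-dive time goes first.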
And finally, are we supporting accessibility features? Are we enabling everybody to enjoy and get the information from the website, regardless of what accessibility feature they need? Welcome to everyone that just joined. Again, we're using Miro today; you can see the link to the Miro board in the chat that my colleague just dropped in there, so feel free to join us. So let's lighten it up again and take a breather. What do you all think are the top reasons your users are not converting currently, based on what we've discussed so far? I promise after this we're going to start diving into the best practices and tips, and then we're going to see some real-life use cases. So what do you think are the top reasons your users are not converting? Again, you can choose as many sticky notes as you want and use the blank ones to add any other reason that I may not have added here. Bad UI/UX, obviously. Inconsistent or non-intuitive navigation. This seems like a no-brainer, but it's actually very common. Designers change, the company changes, you may have acquired a product that was separate and now you've combined them, and there are leftovers, or design debt, so to speak, and you end up with inconsistent navigation. So I can really relate to that. I actually want to, sorry. No, go ahead, James. If you want to add a sticky note, you need to double click, yeah. I like the information overload or cluttered content one. This is another good one that happens a lot, because a lot of times, especially in very robust systems and products, we want to tell users everything we have, but with too much information, too many trees in the forest, you won't find the one you need. So it's important to be focused. For the person that just joined us, we're using Miro here today; there's a link in the Zoom chat, so feel free to join us.
So: UI/UX, inconsistent navigation, information overload, value proposition not clear. I can relate to all of those. And interesting, that one was added as well. There's definitely a common theme across the overall UI, whether it's overloaded, confusing navigation, or a confusing interface. Interesting. This is very good, thank you everyone for participating. Segmentation is also important; personalization campaigns are also used to help with that. The industry is moving there more and more: information that's relevant for one segment will not be relevant for another. So I agree with that completely. Awesome, I think most people have voted. All right, let me take you with me and continue the presentation. Feel free to come back here and vote if you feel the need. So now we're going to talk about tips and best practices. This is something I think is really important, because one of the challenges you mentioned earlier, one that resonated with you, is having multiple contributing factors in complex issues. A lot of the time when you deep dive, you do find there are different challenges and different problems, and you may have different solutions. So the number one tip I can give everyone here is to validate your assumptions with A/B testing. Validate your hypotheses and optimize your product with one variation, or more if your sample size supports it, so you can make data-driven design and product choices and not work on hunches and then see whether something works or not. And if you have multiple issues or solutions, it's important not to test several different solutions at once. Test one variation, one solution, to solve one problem. After the experiment has ended, you can move on and test another, and make data-driven decisions.
It keeps you from working on hunches or assumptions, and it also helps with communicating internally in the company, using the results you're getting from your experiment. James, do you have anything to add on this one? Because I feel this is a very important one. Yeah, I do too. I think A/B testing is the perfect validation tool, and as you mentioned quite rightly there, we heard and saw earlier that there's a lack of getting people on the same page, a lack of buy-in; there was a post that mentioned leadership averse to change as well. When you run an A/B test, it can be really, really simple to display the data in a way that's consumable: this is before, this is after, and you can validate it so clearly. And I think it answers lots of those earlier questions around getting buy-in elsewhere. Sometimes there's too much risk aversion around wanting to make big changes quickly. A/B testing gives you a platform to do it at a much lower scale but still have the same outcome of getting people on board to push that change. So I also think it's a critical step. Yeah, I agree completely, thank you so much, James. Some more best practices. I can't emphasize enough prioritizing potential causes based on impact and feasibility. I know that was step three or four in our five steps, but you can't boil the ocean; you can't just have a list, you have to focus your time. If you go and attack a list of eight, ten, fifteen potential causes, you don't know which are the top ones. You have to be efficient here. Prioritization is very important before you deep dive into each and every one to understand the root cause and where it's coming from. And you might not find anything in one of your potential causes and see it's not an underlying problem at all, and we'll see that here.
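The "validate with A/B testing" advice can be made concrete with a standard two-proportion z-test, which is one common way (among several) to check whether a variant's conversion rate differs significantly from control. The conversion counts below are invented for illustration.

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates, using a pooled rate."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, via math.erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: control converts 500/10,000, variant 580/10,000.
z, p = two_proportion_ztest(500, 10_000, 580, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}, significant at 5%: {p < 0.05}")
```

Running one variation per problem, as recommended above, keeps this test interpretable: a significant result can be attributed to the single change you made rather than to a bundle of them.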
But at least you have a prioritized list to focus on, and if none of the top ones revealed anything, you can move on to the next batch in the prioritized list. As a general tip, because we don't have a lot of resources and this takes time: prioritize, and focus on the top ones. The second one is to collaborate with cross-functional teams. Bring other people in, not only to help you but to brainstorm together and think of possible solutions. They can think of things that you might not, especially technical, UX, and customer success people. Everyone from different departments can contribute based on their knowledge and expertise. This is very, very important, because a lot of times when we dive into this, it's hard for us to see the entire picture, and often other peers can see things we can't and come up with different ideas, potential solutions, or potential causes of the original problem we're seeing. And as mentioned before, test each potential solution separately. If you test them together, you won't know what worked. You have to have a separate experiment for each one. This way you can see whether you made an impact or not, and then move on to the next one. If you test two or three solutions at once, or do a complete redesign of the page, you don't know what exactly helped your users and made their lives better while interacting with your website or product. So it's important to test each one separately. And finally, document learnings: one, so you can collaborate and communicate it internally, and two, so you can come back to it. If what you thought was going to work didn't, you can come back, not rely on memory, and see what you documented earlier. And if you encounter a similar problem in the future, you can always come back and reference it as well. It also helps educate everyone in the company on the process you took. This has been very valuable to me.
And I'm sure these are not the only ones. I think we're at the last question of the session, the last interactive one. What are some other best practices and tips that you can suggest for effective root cause analysis? I have the ones I wrote down here, so you can drag a dot to a sticky note if it's relevant to you, and add more if you think they'd help. This is to help everybody get good feedback and good tips on what can help in root cause analysis. Good data is an obvious one. I should have added that, nice. Five whys is very important, and I agree completely on getting to the root cause. Isaac, you're a big advocate of the five whys. Yeah, I am always asking the five whys. A lot of times when we have a brainstorming session in the product team at Glassbox, I stop everyone and say, wait, wait, let's just ask ourselves, let's go with the five whys. Now, five whys, for anyone not familiar, is where you ask yourself why, five times. Why do they want to do something? Why are they doing this? Why is this happening? It might seem silly when you describe it to someone who's not familiar with it, but it really helps you get down to the root cause of the problem or the pain point that users are having. So that was a great one. Speaking to customers to validate, that's another important one. A lot of times, developing a solution can be very costly and take time, and you already have a roadmap in place. So validating with customers when you have just a mockup, even an internal mockup or a wireframe, and getting their buy-in can really help shape the roadmap and confirm you're building the right thing. It's adopting a lean product management mindset. Another really good one: confirming that the way you're measuring the metrics hasn't changed.
And actually, always ask yourself as well, if you're looking at a metric, exactly what it's telling you. That's a really good one. Sometimes things change, something changes on the site, whatever it might be, and you've got to understand what that metric is actually telling you. If it's a ratio, what is it a ratio of? It might not be a percentage of everyone; it might be a percentage of a segment of customers, whatever it might be. It's about understanding exactly what that thing is, so you don't end up down a rabbit hole trying to investigate something which doesn't exist. I agree completely, that's a good one. I like "don't be married to any one solution." I know this in a different phrasing, and James, you've heard me say it before: don't fall in love with the solution, fall in love with the problem. This is something Uri Levine, the founder of Waze and Moovit, always says; he just released a book on this. It's important to build the minimal MVP, talk to customers to get more data, learn what they want, understand what their next problem is and what they want you to build, and not fall in love with the solution you're trying to build. So I like that a lot. Dropping ego, yeah, definitely. That almost goes hand in hand, actually. Nobody wants their own solution embarrassed; you can sometimes fall in love with it and want it to be the winner, but yeah, absolutely. These are very good. Oh, I forgot to show the cursors, everyone, come out. All right. So thank you everyone for adding these, that was excellent, I love this. And now it's time to get to the case studies. But before we do, does anyone have any questions about what we just reviewed and discussed? Feel free to unmute yourself and ask us anything. Yeah, hi, Isaac, I had a question. Can you hear me? Yeah, I can hear you perfectly.
Yeah, first of all, thanks for a fantastic presentation, and I love how you made it interactive. When you mentioned the five steps to identify the root cause, in step number two you said not to deep dive into the problem. So I was just wondering, how would you get data and not go too deep into the problem? Like, how do you look for that data? So that's a very, very good question. At this step, you do start analyzing and looking at the data, because you have to hypothesize what the potential causes are. But the goal here is to get a list of potential causes. Even if you think, okay, number two, that's the one, I have a feeling because I know the product and I know the clients and I'm familiar with this, or maybe it's a new feature, so it could be a bug because maybe it wasn't tested properly. Instead of deep diving into that quote-unquote rabbit hole that may or may not be the right one, it's important at this stage to keep the analysis and investigation at a high level so you can get that list. Once you have a potential list, you can prioritize the causes and then deep dive into them. And if one of the top ones you focused on wasn't the real cause, you can always go back to the list, instead of going back to the drawing board and having to start the investigation at step one or step two all over again. Does that make sense? James? Thanks. Yeah, I think we'll cover this, or actually, I'll incorporate this into the next part with the case studies. But yeah, there's a big efficiency piece between those first three steps. Almost the most critical thing here is that you don't dive straight into something right away and spend a huge amount of time on something which doesn't give you an answer. It's taking a step back and saying, actually, here's what I have in front of me now.
What could the potential problems be? Then you take those into a list: which is the most likely cause, which has the highest feasibility, and you work through it, as I've mentioned. We can cover it in the case study section; you'll see how this process plays out by actually doing it on a real-world example. Got it, thank you. And the goal here is to be efficient and not boil the ocean, given the lack of resources that we usually have. You'll see it in a real live example now from James. James, I'm gonna hand it over to you. Sure. Yeah, so the way these things come to life, as Isaac mentioned, is seeing them in action and understanding how you could apply this kind of practice yourself. The first case study we have is from one of our customers, a bank in the UK, one of the largest retail banks, and we're gonna talk through that five-step process for a case study which they had. I think you can actually find this on the website if you want more information, but it's one we can talk you through to explain exactly how this looks in the real world. So, starting off with that first step, identifying the problem. This is typically quite a high-level problem, a business-level problem if you like. The business-level problem we had with this particular bank was: we are seeing a low conversion rate for users opening a new account. So this is the problem we're starting with. And we've just been talking about step two, so what does that actually look like? I look at this personally as a situation where you use the information which you're comfortable with.
So you use what you're comfortable with, what you're familiar with, and you analyze what you have in front of you right now without spending a huge amount of time investing in multiple areas and multiple tools. Look at what's available to you. In the case of this particular customer, they had funnels where they could identify the problematic step of that particular funnel or journey, the drop-off location, meaning people not completing that funnel. They then correlated that with user journey information, which gives an idea of where people go if they drop off and don't continue. There's a correlation between what people are doing on one page and where they go next. Are they seeing the thing they should have seen? Are they missing something on that page? Is there something wrong with the page? Are they leaving the site? Are they going elsewhere to get the information? And you don't have to do a lot of work; these are, in effect, two very simple reports to use. There was some click tracking available in this example as well, so we could start to understand whether people were clicking to submit the form. Are we actually getting all the way through the form? I don't know at this stage whether people are filling in their information, but I do know from the click tracking that they do seem to be clicking submit to continue. You start to build a picture of roughly what's happening on the page. And there was some session replay watched here as well, for people who weren't converting, just to see: is there anything obvious in the first five to ten sessions I watch? Anything obvious going on that I can take and use as part of my investigation?
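To make the "two very simple reports" idea concrete, here is a minimal sketch of a funnel report computed from raw session events. The event names and data shapes are invented for illustration; this is not the Glassbox API, just a hedged example of how a drop-off location can be read out of session data.

```python
# Minimal funnel report: for each step, how many sessions reached it
# and the conversion rate from the previous step. Event names are
# assumptions for the example, not real product identifiers.
from collections import Counter

FUNNEL = ["landing", "account_form", "form_submit", "confirmation"]

def funnel_report(sessions):
    """sessions: list of per-session event-name lists."""
    reached = Counter()
    for events in sessions:
        for step in FUNNEL:
            if step not in events:
                break  # steps must be reached in order; stop at first miss
            reached[step] += 1
    report, prev = [], None
    for step in FUNNEL:
        count = reached[step]
        if prev is None:
            rate = 1.0
        else:
            rate = count / prev if prev else 0.0
        report.append((step, count, round(rate, 2)))
        prev = count
    return report

sessions = [
    ["landing", "account_form", "form_submit"],
    ["landing", "account_form"],
    ["landing"],
]
report = funnel_report(sessions)
# The step with the lowest step-to-step rate is the drop-off
# location to investigate further.
```

The step whose rate collapses is where the correlation with journey maps, click tracking, and session replay would then be focused.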
Taking a step back, I've got what I have in front of me and I want to build those hypotheses. I want to ask myself questions about things I either feel, knowing the product myself, as Isaac mentioned, or know might be impacting this further down the line. Is it a UI problem? Is it that users don't understand what they're doing, or what to do next? We see it quite a lot, and Isaac can vouch for this: we've seen so many times in funnels that people were expecting to find information which wasn't there, particularly in the retail world, and leaving to find that information elsewhere rather than completing the flow we were expecting. The onboarding form is creating friction. We know forms create friction; we want to make them as smooth and easy as possible, but there might be points on that form which you know are problematic, although in this particular scenario maybe that wasn't quite the case. The page isn't mobile-friendly. There's an error on the page. Then the simple things: unsupported language, page slow to load, page not accessible. You build these hypotheses up and then validate them a little against what you have and what you know, which takes you into step four. Step four is the next phase, where you write them down in order of which you think are most likely. In this particular example, we knew people were exiting the site from this particular step. We knew they were trying to hit submit. There must be something going on that isn't helping the customer get through, something creating friction for the users. It may well have been that they were clicking to get through but weren't aware they had to fill out a particular field on the form, for example. Or the page isn't mobile-friendly, or it's a UI problem. So now we have our four hypotheses to go and look at.
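The ranking step just described, ordering hypotheses by how likely and impactful they seem before investigating any of them, can be sketched in a few lines. The hypothesis names and the 1-5 scores below are made up for illustration, not figures from the case study.

```python
# Illustrative sketch: ranking candidate root causes before deep-diving.
# Names and likelihood/impact scores (1-5) are assumptions, not data
# from the bank case study.

def prioritize(hypotheses):
    """Sort hypotheses by likelihood x impact, highest first."""
    return sorted(hypotheses,
                  key=lambda h: h["likelihood"] * h["impact"],
                  reverse=True)

candidates = [
    {"name": "page not mobile-friendly", "likelihood": 2, "impact": 3},
    {"name": "error on the page",        "likelihood": 4, "impact": 5},
    {"name": "form creates friction",    "likelihood": 3, "impact": 4},
    {"name": "unsupported language",     "likelihood": 1, "impact": 2},
]

ranked = prioritize(candidates)
# Work the list top-down; if the top hypothesis is disproved,
# return to this list rather than restarting the whole analysis.
```

The point of keeping the scored list around is exactly what Isaac mentioned earlier: when the top candidate turns out wrong, you move to the next one instead of going back to step one.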
How we then shaped the further investigation: knowing there might be an error, we wanted to validate against the console data I mentioned at the start, and that confirmed there were definitely clicks on the submit button of this form. There were definitely people clicking to get through. When you watch session replay to correlate that back, there was no error shown to the customer: when they clicked submit, nothing happened. So then you move to the third check, the backend errors. And this showed there were backend errors, and when you read into those errors, it was because certain fields had been submitted with particular special characters in them: accent marks, characters that weren't accepted. It was the formatting of the field that was causing the problem. Now, you could say it's a UX problem, because we weren't highlighting the validation issue on that particular field. But it's about understanding and prioritizing those issues, and how I'm going to investigate them, working through these checks to understand exactly what was going on: the root cause, effectively. So there was an error without any communication, and finally we get to our root cause. We found what the problem is: the onboarding form does not support accented or special characters. And I'll tell you for a fact that this is one of the most common things I see personally on forms when I speak to customers. It's such an easy thing to find, and such an easy thing to fix as well. So I hope that helps put those five steps in perspective and shows how they look in the real world. Anything to add, Isaac?

No, I think that was great. Thank you. I just want to add that this was a real case from one of our clients, like James mentioned at the beginning, and we've seen a real improvement in their account opening after this.
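As a side note, the silent-rejection failure in this case study can be sketched as a validation function that at least names the rejected characters instead of failing with no message. The field rule and function below are assumptions for illustration, not the bank's actual code, and the fuller fix is of course to accept accented characters rather than reject them.

```python
import re

# Overly strict rule of the kind described in the case study:
# ASCII letters, apostrophes, spaces, and hyphens only.
# (Shown to illustrate the failure mode; the better fix is to
# actually support accented characters.)
ASCII_NAME = re.compile(r"[A-Za-z' -]+")

def validate_name(value):
    """Return (ok, message). Instead of the backend failing silently,
    name the rejected characters so the form can show clear feedback."""
    if ASCII_NAME.fullmatch(value):
        return True, ""
    bad = sorted({c for c in value if not ASCII_NAME.fullmatch(c)})
    return False, "Unsupported characters: " + ", ".join(bad)
```

With something like this in place, the session replays would have shown users an actionable message at the submit click rather than "nothing happened".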
So now I want to talk about something we encountered internally at Glassbox. We're not free of mistakes, and we use our own framework on ourselves to make sure it works, and we're always evolving it. We saw a low adoption rate of a new feature we had just released. So immediately we went to identify the problem. It's an easy one: low adoption of a new feature, specifically one called Path Analysis, in the product. And again, we chose the two scenarios we think are most common for product managers. Because we knew this was a new feature, we started gathering and analyzing the data at a high level. First we analyzed funnels to better understand how users engage with the feature. We collected user engagement metrics such as usage frequency and time spent, and looked at user feedback to see if anything correlated. We analyzed user journey maps to identify drop-off points and what users were doing instead of interacting with our new feature. And we reviewed click tracking to see where users were focusing their attention. It's important to give context: the feature was hidden behind a button as part of another product, another product in itself. And we started collecting relevant session replays on that specific product to identify patterns in behavior. When we moved to step three, we started hypothesizing potential causes. We came up with six that we thought were relevant based on what we had found so far. It could be a complex user interface affecting navigation; this is common with data solutions and products because, like I mentioned before, you want to present a lot of information and it's hard to create a good user experience there. Perhaps inadequate onboarding or guidance on the new feature is the cause. Or the feature doesn't align with user needs or address a significant pain point. This is a hard one.
Maybe we developed something they're not interested in and don't need. Lack of awareness or visibility of the feature. Lack of a clear value proposition. Or finally, because it's a new feature, the immediate suspect can be technical issues or bugs. In step four, we prioritized these based on their likelihood and potential impact. Lack of awareness or visibility was the first one. We have a very good QA department, and we all check our own solutions before they go to production, so we were fairly sure there weren't any technical bugs, but we still added that. So: first, lack of awareness or visibility; then inadequate onboarding and guidance, which in this case ties very closely to lack of awareness; and then technical issues or bugs to check later, because we still wanted to be a hundred percent sure. First things first: let's check that the feature is working as expected. And indeed, in step five, we deep dived and found that the feature is working as expected across different platforms, for the different stakeholders and people who tried it. We even talked to some clients and saw it working for them when we guided them through it. There was no technical problem behind it. Funnels showed that users who see the page are not even clicking the feature button: users are coming to the product and interacting with it, but not clicking on the feature hidden behind the button, like I mentioned before. We looked at user segments and saw that the low adoption was not specific to any particular segment, not mobile or anything like that. It was a general low-adoption trend, not specific to anything. And heatmaps showed minimal interaction with the button leading to the feature: there were almost no clicks on the button leading to the new feature itself.
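The "no clicks on the button" finding boils down to a click-through rate on the feature button among users who saw the page. A hypothetical sketch of that metric (the event shape and field names are assumptions, not Glassbox output):

```python
# Hypothetical sketch: click-through rate of a feature button among
# users who viewed the page it sits on. Event dictionaries and the
# "user" field are invented for the example.

def button_ctr(page_views, button_clicks):
    """Share of page viewers who ever clicked the feature button."""
    viewers = {e["user"] for e in page_views}
    clickers = {e["user"] for e in button_clicks} & viewers
    return len(clickers) / len(viewers) if viewers else 0.0

views = [{"user": u} for u in ("a", "b", "c", "d")]
clicks = [{"user": "a"}]
ctr = button_ctr(views, clicks)
# A near-zero CTR that holds across all segments points to a
# visibility problem rather than a utility problem.
```

Breaking the same number down per segment (mobile vs. desktop, new vs. returning) is what ruled out a segment-specific cause in this case.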
Finally, we looked at relevant session replays from that area of the product, and they revealed that users were not even interacting with or exploring that button. We had hoped they would explore it, see what it was, and judge whether it gave them value or whether they understood it, but they didn't even do that. And if I can comment on whether it solves a real pain point: we did a lot of user research beforehand and after releasing it, so we were very confident that it solves a real pain point. I can tell you personally, as a product manager who has used products like Glassbox throughout my career, that it's a feature I really like. I wish I had always had path analysis myself: the ability to do journey mapping, define a start and an end for the journey, and see how users go from the first step to the last throughout any journey. Finally, after looking at all this and analyzing the data, we were able to easily identify, and I think you already know it by now, that the feature is not prominent enough. It's hidden behind a button. It lacks visibility. There's no label on the button, and the icon wasn't clear about what it does. There was nothing to tell users: hey, there's something new here, something that gives you value, that you should click on and use. So in this case, the top two hypotheses we came up with, lack of awareness and inadequate onboarding, both held, and this was due to the feature not being prominent enough and lacking visibility. That was almost the end of the presentation; we have two steps left. First, a Q&A for everyone who has questions, and finally a chance to get your feedback on the session. So if anyone has any questions, now is the time.

I have a question. Hello, my name is Ann Viren, I'm an aspiring product manager, and thank you guys for hosting this session.
My question is: with root cause analysis, is this something you do once at the beginning, in the ideation phase, and then keep going as you learn more, or is it something you continually do as new problems or issues arise with your user base?

That's a great question, actually. It's something we do continually, including in ideation, because we want to understand what we want to build and what problem we want to solve, but also whenever we identify problems. And sometimes there isn't a sudden problem. When we look at a problem, we can say, okay, conversion was at 5% and now it dropped to 1%, so that's a problem. But maybe it has just been steady at 1% for a long time and you know you can improve it. So this is a framework and a methodology that we incorporate and use all the time, and we really recommend everyone in the company adopt that mindset of analyzing root causes. That's a very good question, Ann. Thank you.

Hi, Isaac, can you hear me? Hi. Sorry, I just switched on the video. So for your problem at Glassbox, for example, the adoption rate or the engagement: which software do you use to get all this information? Google Analytics or something else?

Well, we believe in dogfooding, which is using our own product. We use our own product on ourselves to actually improve the product. So we use Glassbox for all of that.

Okay, all right.

Because we have internally both product analytics and funnels and session recordings and heat maps and a VOC solution, we're really a one-stop-shop solution, so we were able to uncover a lot of data.

Right. Because in most sessions I've seen, root cause analysis is covered only in theory, and it's a very rare occasion that it comes with a case study. It's really easy for us to understand the theory when there are case studies alongside it,
so we understand why each step is important. Thank you so much.

No, thank you, I'm happy to hear it. Thank you so much for this. Any other questions? All right, so I hope you enjoyed this. I'd like to ask everyone to tell us how useful this session was for you. Same method as before: simply take one of these dots and drag it to one of the smileys, and at the bottom you have a chance for a retrospective, telling us what you liked, what you wish had gone better, maybe some new ideas, and any questions we could cover in a future workshop. So feel free to tell us exactly how we did. We'd love to hear it, and we'd love to have another session at some point in the future. Thank you all for collaborating today.

Oh, I see, Chris, you'd like to see the interface of Glassbox. It's a very robust solution, and there are a lot of products within Glassbox, but I'll take that into account for the next workshop. Also, for anyone who missed it, there are dots here at the top; I like this cursor feature, so you can just drag one to however you felt the session was. I'm glad to see the session was very useful for everyone. We worked really hard on this, and we tried to give you practical guidance and tips on what to do, so you can always reference it back if you're stuck or doing a root cause analysis internally at your company. I see you're taking the dots and now voting on what you liked; that's great. If you jump into the chat, Isaac, there's a question there as well: have you seen examples where root cause analysis would not be a good choice due to time pressure? So first of all, thank you for saying it was a wonderful session.
You would still want to do it even under time pressure, because otherwise you run the risk of trying to solve a symptom, something that isn't really causing the underlying problem. When you have time constraints, time pressure, or not enough resources, that's where prioritization becomes more important and more focused. The first steps are easy: understand what the problem is and gather some data. And if you have all the data readily available, at your fingertips with the right tools, that step is really easy for you. But with prioritization, instead of focusing on four or five items, just focus on the one that you think, based on your knowledge of the product, the users, or the feature, and on the initial data you looked at, is the underlying cause, and then deep dive into it. Otherwise, developing a fix, creating an A/B test, designing it, running it through UX, and then running the experiment for some time can cost you far more than doing that initial validation and due diligence at the beginning.

I really like your answer, because I'm wholeheartedly with you on that. What I've failed at doing, and I've been doing RCA since 2010 thanks to being surrounded by engineers, is actually bringing that to a business that has not been as proactive as it should be; explaining the value. So I want to understand how you would explain the value of something like RCA: saving time, saving effort, actually addressing the problem, and giving the customer the right solution instead of just what they're asking for now.

Well, that's a great question. First of all, you can invite them to my next workshop. We'd be happy to talk to them and explain the value.
You probably have, and I don't know your products, so I'm just going to run with ideas here, experiments that didn't go so well, where you wasted time. So you can communicate: look at all the time that was wasted here, and we got no impact from these experiments. And perhaps you can do that root cause analysis once and show them results: look, I have data to back up what I'm saying about the root cause, and now that I have the data, we know exactly what to solve; here we ran an experiment or released a new feature, and now we know it actually solved the problem. So back it up with data. Data talks; without data, it's just everyone's opinions. I usually try to be a person with no opinions. I have my hunches, but when I communicate with external stakeholders and try to get buy-in, I always go to the data and back up what I say with it. Once you have the right data points, and this is why the analysis is important, they can't argue with it. It's data; that's why I like it. Does that help?

It does, and in my situation they are against the data, so that's why I'm trying to get more insight. Thank you so much.

You're welcome. And if you want, we can get in touch on LinkedIn; that's an invitation to anyone who wants to further discuss any of the ideas here, I'm happy to do so. And whoever said "please come up with more sessions like this": definitely we will, thank you so much. Oh, I see we're at the hour and so many people are still here. I really appreciate your time. This was really fun for me, and I know for James as well. Thank you, everyone. We're definitely going to have more workshops, and we appreciate everyone's participation.