Hi everybody and welcome to this month's UX Research Functional Group update. So this month I thought I'd do things a little bit differently. So rather than me just talking at you, I thought I'd make my update a little bit more interactive. So I'd like some audience participation please. We're going to play the UX Research Quiz. It's easy to play, and the rules are very simple. I'm going to ask you a question about some UX research which has taken place since my last update, and I'd like you to try and guess the answer. Now it's easier to play if you have your own copy of the presentation open, as some of the answers are spread across multiple slides, so it will allow you to just flick back easily. I want you to enter your guesses into the chat window, so A, B, C or whatever, and you have one guess per question. I'm going to give you a couple of seconds to think about your answer, and then I'm going to give you a countdown. All answers must be in the chat window before the countdown ends. Now the rest of the UX team has an unfair advantage: they already know the answers to these questions. But I'm going to give them a choice. They can either answer honestly, or they can try and convince you to guess incorrectly. So you won't know whether they're telling you the right answer or not, and it's totally up to you whether you choose to trust other members of the team. For obvious reasons there are no issue numbers or answers on this presentation. However, I've got the slides updated and ready to go, and they'll be added at the end of the call. So does anyone have any questions before we begin? No? Okay. Question number one. How many people subscribed to the UX research panel? Is it A, 901; B, 964; or C, 908? As a hint, in last month's functional group update, there were 887 people subscribed to the UX research panel. So if you think it's 901, enter A into the chat window now. If you think it's 964, enter B. Or if you think it's 908, enter C.
Okay, you have three seconds left to answer. Two, one, time's up. Now if you said B, 964, you would be correct. If you remember, last month I hoped to reach 1,000 users. I didn't quite get there, but I saw lots of you sharing the link to the research panel on your social media channels, so thank you very much for that. An additional 97 people is still a great result. Let's keep sharing the link. It's on the presentation slide, and people can find all the details they need about the panel on that link as well. So question number two. Users were asked to import a GitHub project into GitLab. Of the following designs, which allowed users to complete the task with the most accuracy and speed? So we have A, which was how the page used to look. We have B, which is a tab design, and you can see import projects is here. And finally, we have C. If I can get the slide up. We have C, which is a wizard design, which takes the users step by step through the process of importing a project. So to recap, users were asked to import a GitHub project into GitLab. Of the following designs, which allowed users to complete the task with the most accuracy and speed? Okay, three seconds left to answer. Two, one. Okay. So if you said B, you would be wrong. And if you said C, you would be correct. I tested all three designs with 144 users. For accuracy, so this is the percentage of people who clicked in the correct location to import a project, the results were: A was 82%, B was 94%, and C was a whopping 100% of users getting it right. In terms of speed, so this is how quickly people clicked in the right location, A was six seconds, B was 5.7, and C was 5.2. Now it might not seem like a huge difference, but any tiny bit of time that we can save users is great. We've actually already implemented design B. Although it didn't perform as well as design C, it was still better than the existing design, and it was actually quicker to implement as well.
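For anyone who wants to play with those import-test numbers themselves, the reported results can be tabulated and ranked programmatically. This is just a minimal sketch using the aggregate figures from the talk; the design labels and the ranking rule (highest accuracy first, fastest time as tie-breaker) are my own framing, not part of the study:

```python
# Reported results for the GitHub-import test (144 users in total).
# accuracy = share of users who clicked in the correct location;
# speed_s  = average time in seconds to click in the right location.
results = {
    "A (original page)": {"accuracy": 0.82, "speed_s": 6.0},
    "B (tab design)":    {"accuracy": 0.94, "speed_s": 5.7},
    "C (wizard design)": {"accuracy": 1.00, "speed_s": 5.2},
}

# Rank the designs: highest accuracy first, fastest time breaks ties.
ranked = sorted(results.items(),
                key=lambda kv: (-kv[1]["accuracy"], kv[1]["speed_s"]))
best_design = ranked[0][0]
print(best_design)  # C comes out on top on both measures
```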
So we're looking towards implementing design C at the moment. The team is very much working iteratively. So question three. Users were shown the bottom of an issue and asked to identify whether they were looking at an issue or a merge request. What percentage of users correctly identified that they were looking at an issue? So users were shown the image that is on the presentation slide, which we all know is an issue, but I asked users what they thought it was. So is it A, 63%; B, 71%; or C, 86%? Which was it? Okay, three seconds, two seconds, one second, time's up. If you said C, 86%, you would be wrong. If you said B, 71%, you would also be wrong. It's A: 63% of users recognized that they were looking at an issue. 22% of users thought they were looking at a merge request, and 15% of users weren't actually sure what they were looking at, so they chose to pass on the question. To give a bit more context, I spoke to 81 users, and they were a mixture of GitLab users and people using competitor tools. What was also quite poor was that it actually took on average 28.6 seconds for users to answer that question, and that's pretty slow. So we're currently thinking of ways that we can visually distinguish between issues, merge requests, and potentially epics as well, so that users can understand at a glance what they are looking at. Okay, question number four. Users were asked to add a due date to an issue. Of the following designs, which allowed users to complete the task with the most accuracy and speed? So for anyone who's not sure, this is what an issue currently looks like, and you can see that the sidebar is gray. And then on B, the sidebar is white. That's the only subtle difference between the two designs. Okay, three seconds left to answer. Two seconds, one second, time's up. So if you said A, you would be correct. Now you might be wondering why we tested just changing the color of the sidebar with users.
It doesn't seem like a massive deal or a massive thing to test. Well, the aim of this test was to see whether users could align the content within the sidebar with the rest of the page content, so the issue description. We wanted to make sure that they knew that there was a connection between them. And generally, we also wanted to try and make the page feel less boxed in. We found that a white sidebar performed worse than our existing sidebar color, and also that users had a higher propensity to click the edit call to action. So that actually makes us think that they perceived a weaker connection between these two types of content. So for now the sidebar remains gray, but we'll be trying other colors. So last question, question number five. Users were asked to move an issue to another project. Of the following designs, which allowed users to complete the task with the most accuracy and speed? So we've got A, the default, where the move issue call to action is in the sidebar, right at the bottom here. And we have B, which is actually behind the edit call to action. Now, for anyone who remembers, this was originally where it was. What's different with this question compared to the other questions is that the testing took place after the change had been implemented, whereas for the other questions, we did the testing before anything had been implemented. Okay, three seconds, two seconds, one second, time's up. Okay, so I have a confession. This test was actually inconclusive. Now, we ran this test because we found that our own team were confused after the change had been implemented, and also that some users struggled to locate the move issue call to action. We also found that there were a lot of people opening up issues about it. Now the results, accuracy-wise, were that A produced 52.5% and B was 50%. Neither of those is particularly high. And speed-wise, A was 18.7 seconds and B was 16.8.
And again, neither of those is particularly fast. So if accuracy and speed for both locations are low, that tells us that neither location is intuitive to users. Now, when this test was run, the move issue call to action had only just been implemented, and sometimes people need to get used to where a new feature is. So in this instance, we're going to monitor the situation and rerun the test again shortly. If the results are similar, then we know we need to test another location for the move issue call to action. And if they've improved, then we know the location is right, but we need to do a better job of drawing people's attention to things when they've moved in the interface. And then obviously I talked about changing the sidebar color, so this is a good example of where one research test might impact upon the results of another research test. We need to constantly be aware of everything that's changing in the interface so that our test results aren't skewed. Well, that's it for today. So thank you very much for taking part. I can see loads of letters flying up the chat channel, so thank you very much. Do let me know whether you prefer this format or not, and let me know if you have any improvements that I can make. This is the first time I've done it. Thanks guys. Does anyone have any questions about the research? Yeah, prizes. Prizes maybe next time. What are you looking to research next, Sarah? I'm doing a card sort at the moment. So that is to sort out the content that's in the left navigation, so the contextual navigation. Okay, so there's no further questions. I just want to mention that we're currently hiring for a junior UX researcher and also two UX designers. So if you know of anybody suitable for the roles, then please do get them to apply. So thank you very much everyone, and enjoy the rest of your day. Thanks Sarah. That was awesome.
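A footnote on the inconclusive move-issue test from question five: whether a 52.5% versus 50% accuracy split is meaningfully different can be sanity-checked with a two-proportion z-test. The per-variant sample sizes were not given in the talk, so the 40-users-per-variant split below is purely an assumption for illustration:

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """z-statistic for H0: the two underlying proportions are equal."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Reported accuracy: A = 52.5%, B = 50%.  The 40-per-variant sample
# sizes are assumed, not from the talk.
z = two_proportion_z(0.525, 40, 0.50, 40)
# |z| far below the 1.96 threshold for significance at the 5% level,
# which is consistent with the test being called inconclusive.
```

Under any plausible split of the participants, a 2.5-point gap this small would not reach significance, which supports the decision to rerun the test rather than act on it.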