There's a difference of disciplines and expertise in the room. Who here is a designer or a researcher? And who's done research before? I've tried it. Great. OK. All right. Good. So you've got some baby experience in it. Are you all working in small teams on new products? Yes. OK. Lots of nods. Great. Let's get to that, OK? So this is for you then.

So we're going to talk about what user research is, and definitely what it is not, to try and bust some myths. Why you should do it. Sasha's going to give us some information about cognitive biases and how we work as humans, especially when we do our research and talk to users, so what might be going on in the user's mind at the same time as you interview them.

I have been doing user research for about seven years now. I've worked on a number of different e-commerce, financial and government projects, and before that I worked at Europe's leading usability testing company as well. At ConsenSys I work across a lot of different spokes, helping do research with a lot of different users, including with Sasha, and I also help train designers to do research as well.

So hi, I'm Sasha Danece. I've been working as a designer for nine years now. Right now I'm head of design at Alethio, a ConsenSys spoke, and my role there is to do user research, user experience design, user interface. I champion user research all the time, and this is why sometimes I'm the bad cop of my team: I stubbornly stick up for users and their rights. So that's me.

Okay, so there are actually enough people to do this in pairs, which is a bit better. Just grab somebody and find out a little bit about them and their experience of doing research. We'll literally do this for just a couple of minutes. So I'm going to set a timer. There's two here, there's two there, two there, and you guys can make a three, something like that. So if you just introduce yourselves to each other, find out what they know about user research.
So I heard a couple of such experiences; one request was to do some overview of what we're going to be doing in the future. Did anyone share their biggest pains doing user research, or the biggest problems that they've encountered? Then we can try and focus on that towards the end of the session and tailor it to you. So has anyone got anything they're dying to know about? I think you said that you were interested in finding people to talk to, how to go about finding users. Anyone else with anything else they're interested in? All of it. Amazing. Okay, cool. All right.

So what is user research? We're not going to go on too long about this, you've all got a sense of it, but I just want to explain. This is Margaret Mead. She's a very famous anthropologist, and she famously wrote this: what people say, what people do, and what they say they do are entirely different things. She discovered this by observing people in their natural environment, interviewing them separately, and doing surveys; she tried loads of different methods. This was in the 30s, and it was quite influential at the time and also quite controversial, because this was around the time when man was meant to be master of his own mind, right? And consumer behaviour science since then has proven this to be the case: people are not very insightful about their own habits, behaviours and motivations. They need help eliciting that, and that's the job of the user interviewer.

So what is user research? We're talking about qualitative research here. You could go and do big data surveys, and we'll talk about that in a minute, but it's usually interviewing or observing real people. What we're looking for is identifying needs, behaviours, goals, motivations, pains, all that good stuff which tells you what you should be building, and understanding the why behind what they do, right? What they do is interesting, but it's hard to make informed decisions based on that alone. So what we want to understand is the intrinsic motivation behind people's behaviours. And then the whole purpose of research is to have an action plan off the back of it, right? You've seen these patterns and themes across users, and now you come up with some recommendations and actions; we'll talk at the end about how you can do that with your team as well.

So what is user research not? It is not about people's opinions. It is not about whether they would prefer the yellow object or the black object, right? That's more like market research. Market research is about asking many people the same set of questions, and you end up learning a little about a lot of people.
The benefit of qualitative user research interviews is that you learn a lot from a small number of people.

And data analytics, site usage, what's happening with your product: that data is very interesting, okay? But there are two problems. The first, in the blockchain space, is that most apps or dapps or services don't have many users. If you don't have data, then how else are you going to find out what people are doing? A lot of companies are actually intentionally not building any analytics into their product, on the basis that they want to keep it pure and make sure users' behaviour isn't being tracked by any third party like Google Analytics. So data analytics is useful, but you need a lot of users for it to make sense. Otherwise you're basically just looking at what your own development team is doing on your website. And that's something I've experienced before: we've had 100 users, and every one of them was someone on the team. Analytics is also something you can only do once you've built the thing, right? That's the wrong time to be doing research. We need to be doing research at the start of the thing, really before you've built anything. There is a bias towards action and a bias towards technology within the blockchain ecosystem, which is: let's get building something. But I question the value of building the wrong thing. And I really like to encourage people to at least do some research to start things off.

So why is being able to interview users important to you? You've probably heard the statistic, somewhere between 50 and 90% depending on what article you read: between 50 and 90% of all startups fail in their first year, right? And then the remaining lot start failing, and only a small proportion last to year five. So based on that information, most people in this room are working on a failing product. I want people just to think about that for a second, right? And let's be open to this, because we live in a world where we have to be cheerleaders for our product all the time. We're in our own little echo chambers, in our bubbles. Yeah, it's going to change the world. Well, we don't know that to be the case. And the way we can de-risk is by doing research early. So just get some done first, right? Before we build the product and find out it's the wrong thing. That's the whole point: it's saving time and money by doing this. It can really help align people around the problem as well.

We're in the early stage of this pioneering technology, and it's certainly changing the way that people and businesses interact with one another. But because the crypto and blockchain world is relatively small and highly technical, it's easy for us to think that our users are just like us, right? Oh, we're building a product. Who's the product for? Oh, it's for people like me. That might be true, but it's guesswork until you go and find out, right? So I'm not saying that you're wrong, but maybe just go and verify that, because not everybody speaks the same language as us. And I mean technical language as well as nationality and natural language. So that's really important. And probably you don't have a dedicated user researcher on your team; most teams don't have someone like me or Sasha to come and do all the research for you. So don't just leave it until you're six months down the road and you hire a designer and then get them to do some research. Do it first.
I think it's a really great and empowering activity. So I'm going to hand over to Sasha to talk about some tech failures.

So how many of you are working right now on a product? Oh, many. How many of you have done user research for a product? Not so many. It's okay, you're not alone. Many tech companies forget to do that and ignore user research. This is called tunnel vision, and it's a deadly thing. Design ignores users, and if users feel that they are ignored by the design, they are going to ignore the design. So I will show you some heartbreaking testimonials.

This is from eCrowds. They say: we spent way too much time building it for ourselves and not getting feedback from prospects. It's easy to get tunnel vision. I might recommend not going more than two or three months from the initial start to getting it in the hands of a prospect. So if you are working on your product and it hasn't been more than three months, it's all right, you can do it now.

And this is from VoterTide: we didn't spend enough time talking with customers and we were rolling out features that I thought were great, but we didn't gather enough input from clients. We didn't realise it until it was too late. It's easy to get tricked into thinking your thing is cool. You have to pay attention to your customers and adapt to their needs. So probably we have all been there.

So what does bad user research look like? The first type of bad user research is no user research. You may be familiar with this quote from Henry Ford. I've got a question mark there because the first attribution of this quote to him was actually in 1970, so who knows if he actually said it. I've heard this from people in blockchain as a reason not to speak to customers. There are a few problems with this. It's also used to argue that true innovation comes from single, gifted individuals who know better than anybody else. I'm sure we all know people who behave a bit like that. But the problem with using this quote as an argument is that this is not user research. What he's actually talking about is "if I had asked people what they wanted". We don't ask people what they want in user research; that's capturing opinions. People are very poor predictors of their future behaviour. So this is just bad research that he's talking about.

In addition, Ford the motor car company suffered due to their lack of innovation. They obviously created a great product, but all the innovation that happened after that was on the production line, to save money and build cars more cheaply. You may remember he also apparently said that you could have any colour you want as long as it's black. So what happened was Ford's share of the market dropped from 66% in 1921 to 15% in 1927. That's a massive drop in market share in six years. You can use that as a fact next time somebody says we don't need to speak to customers. The competition was innovating in the space and Ford was not, and that's the reason why that happened.

Maybe after today's workshop you will think twice about putting this out. OK, we've all done this at some point: I've made a thing, please give me feedback, that would be amazing. So what happens? What sort of feedback do you get? Who answers it? Are they people who are your fans? Are these people who already know what your product is? What about the people who didn't click on it? Maybe you should find out what they think, because talking to detractors is really, really valuable.
And then this brings us on to surveys. Most surveys are absolute garbage. I'm really sorry to break this to you. There are a few reasons why this is the case. They feel like a good tool because they're ubiquitous: every company is putting them out there all the time, you always get requests to fill out a survey, so the sheer frequency lends them validity. The problem is that it's very easy to create a bad survey. Survey design is actually a science, right? There are statistics that go into it. In order for it to really help you, you need to put a lot of effort into it. But it's really easy to just throw one up: what should we ask them? What do you like? What do you dislike? Send it out, OK? And I think surveys are often used instead of user research because talking to people is scary, right? We want the survey to do the talking for us. And we expect everybody to fit into the neat boxes that we've already determined for them. But people are messy, OK? So it's not as simple as that.

And I'd like to challenge this a little bit further and ask you to think about your own use of surveys. When you get sent a survey, do you fill it out? Why do you? Why do you not? Is it because you have a particular relationship with that brand? What happens when the survey is too long: do you abandon it halfway through? What happens when you see a question and you're not really sure what it's asking you? Or you don't feel that the survey is asking the question you really wish it was asking? And then you get asked to rate something between 1 and 10, and what's the difference between a 7 and an 8? Someone's actually going to do some number crunching around that, and they haven't thought about the science of it either, right? So they're not that useful, OK? I hope I'm getting that across.

And also, we default to having confidence in large sample sizes, right? We think, well, it's better than speaking to a few people; we're going to get loads of survey responses and that will give us confidence about the answers. So firstly, do you actually get large numbers of responses? I would question that. If you put out loads of surveys and get like 15 responses, does that really help you make decisions? There's a big question around that. And it's only true that surveys give you more certainty if you're using a robust recruitment method: recruiting a very large group of people representative of your users and then randomly selecting from them. That's the way to get accuracy with large data. No one does that when they send out surveys. They just blast their customer base and hope that some people reply. So we guarantee, Sasha and I will put money on it, that if you go and have deep interviews with five people, you will learn a lot more than you will by surveying and getting responses from 20 people. I promise.

Another type of bad user research is doing it with those who are easiest to access. The people who love us. They're great, they're like our cheerleaders, they want us to do really, really well. The problem is that they're probably too close to the product. They may speak the same language as us, in terms of the terminology that we use for things. They probably have an idea of who you are, and they want you to do well, so they're probably not going to be very honest. But there is a use case for doing this: practise your interviews with them.
They're great people to practise questions with, so don't feel scared to do it. It's just that these aren't your users; this is just other people's opinions. So, yes, I'm going to hand over to Sasha. She's going to talk about some brain hacking.

Yes, what we're about to touch on now is some sort of brain hacking. We will get to learn our brain's vulnerabilities and how to turn them into opportunities, to use them to our advantage. So I will run you through some cognitive biases, and feel free to raise your hand if you spot them and have experienced them. What are cognitive biases? They are mental shortcuts our brain takes. This means they are unconscious and we can't avoid them. But what we can do is learn how to spot them by knowing more about this subject. Cognitive biases affect us all; researchers are their victims too. Why it's so important to know about them is because, as researchers, we might spoil our own work. Let me give you an example. You are doing research and you have some assumptions. You're convinced an assumption is true, and because you really want to prove it, you formulate your questions just to prove it. So you are influencing and biasing your user by not articulating the questions very well. But researchers are not the only ones affected by biases; users are humans too. You have to bear that in mind, and when you do user interviews, don't take their words for granted, just try to read between the lines.

So we'll start with response bias, and I'm quite curious how many of you have experienced it. I've had it myself many times. Response bias is caused by the way we collect data. This means that you can bias your entire research by the way you articulate the questions. Sometimes this happens when the development team conducts the research; it's very hard to step away and be objective, especially if the responses you are getting are not what you want to hear. Have you experienced it? Have you been through that? How many? Put your hands up. A very sincere crowd, I love it.

Why do you specifically say that the development team conducts the research? I mean everyone who works on the product: it's not simply the developers, it's everyone developing the product.

Okay, so on to our next bad boy: groupthink. It's the desire for harmony in a group, which results in dysfunctional decision making. We want to minimise conflict and reach consensus, so we might suppress opposing viewpoints and isolate our team from outside influences. We love our product so much that we don't want to hear anything from outside of it.

Confirmation bias. This is the tendency to search for, interpret, focus on and remember information in a way that confirms our beliefs. This happens when you do user research and you only hear the answers you want to hear. Even though 95% of the users tell you something else, you only want to hear the 5%, so you get skewed information. That happened to me at Alethio: when we were working on our blockchain explorer, we were so convinced that our core users were the power users that we only wanted to hear that.

Curse of knowledge effect. This is when better informed people find it extremely difficult to think about problems from the perspective of less informed people. It's when we use the terminology and think that everybody should understand it. Think of it: you're working in blockchain, right?
And you talk to everyone using the terminology and expect them to know what you're saying, but that's not all right. You don't want to make your user feel dumb. Don't challenge your user; even if you're the expert, just play along with them.

Social desirability bias. Isn't this one of the biggest issues right now? Yeah. I mean, you can encounter it all the time when you're doing research. No, but I mean specifically in blockchain: if you look at the UX right now, you take people who have never heard of blockchain and you confront them with all the dapps and the onboarding, and it's ridiculous. They don't understand anything, which is totally fair; to them it makes no sense. Yeah, but then it's a new user; they have no clue what we're talking about. And it's all right not to know. Yeah, but I think one of the biggest challenges for us in blockchain is to take people who have no clue what blockchain is, onboard them and make them successful. I think it's our role to do education.

So, social desirability bias. This is about users: your interviewee might give you the answers they think you'd like to hear over their true beliefs, because they want to please you. This always happens to me, and in order to avoid it, when I do a user interview I tell them: I never worked on this design, don't worry about it, the people who are working on this are not part of the research. So you ease them in and make them feel more comfortable being honest.

Hawthorne effect. When people know they are being observed, they might change their behaviour; they might think they are being tested in an interview instead of just taking part in it. So a tip for this is to tell them, and remind them every now and then during the interview: you are not being tested, only the product is being tested. This once happened to me when I was testing with a user: he didn't know what an ABI was, and he just started to be very reluctant to answer my questions. I didn't want him to feel bad. So you want to avoid that.
The IKEA effect. I think all designers and development teams are affected by this: we place a high value on products we created or worked hard for. Let your users do something as part of the onboarding process, not too hard but rewarding, so they can connect with your product. And let's stick with that: we place a high value on products we created, so every time you do research or you get feedback, because you love your product so much, you tend not to hear what people are telling you. But there is also a reverse side: you can actually turn this bias into an advantage. As a researcher or a designer, you are the designated messenger for bad news. You get the bad feedback and you have to tell it to your development team. And in order for them not to feel as if you went off somewhere, invented all this by magic and left them out, you invite them to take part in the user interviews whenever they can, or to witness everything that's happening, because UX has to be transparent. We love transparency in blockchain, so we have to make all the processes transparent too.

This is about the fact that everyone can do user research. It's a skill that you can learn; you're not born with it. You only have to get into the mindset of a user researcher. And that means you have to be a detective. Think of it: you're the Sherlock Holmes of your product, so you have to pay attention to all the details and ask all the WH questions you need in order to get all the answers and the insights. Also think that you have to be a psychologist: you have to understand how people work and why they do what they do, and try to see what tickles them. And you also have to be like you're on a safari: you have to observe people, but not be an active participant. Try to take a step back and not influence them with your opinions and your biases. It's not a petting zoo, it's a safari.

The first thing that we're going to do, as we start getting into the activities now, and I hope you've all had your coffee and woken up, is what we do at the start of any research: stating our assumptions. This is a really valuable thing to do with your team. The assumptions are basically: what do we think we're going to find out when we start interviewing people? That's what we're going to put down. They don't have to be right, you don't know the answer to these things, but they're a hunch, a guess: I think probably this will happen, I think X because Y. It's really good to do this as a whole-team exercise, using that IKEA effect again, to get people involved early so they care about the outcome. And when you do it as a team exercise, you start surfacing where there is disagreement, where there are conflicting views about beliefs that people in the company hold. Those are great places to focus the research on. So if you've got a debate in your organisation about which way to go with a particular product feature, then interviewing people can help settle that question.

So we state it like this: I think X because Y. And it's fine not to agree about them. So, a basic example, and Sasha's going to hand out some things to you all as well. An example research question: let's say you're building a decentralised exchange. Where's the gap in decentralised exchanges now which could be an opportunity for us? Quite straightforward. An assumption might look like this: I think what's missing is a mobile app, because if users can access the DEX on the go, this will lead to more usage and more liquidity on the exchange. So Sasha's bringing out some things here.
We need to get into groups of three, so if you can organise yourselves into groups of three please. You're a two. We're just going to spend probably only about ten minutes on this activity, and I want you to work together to state what your assumptions are about the thing I've put in front of you. In your group you should all have the same thing, you all have the same question. So have a look at the research question and just write your assumptions down. You don't have to agree with each other, just get them written out. Try to come up with at least five, and use the big cards as well. What we're thinking is: if we were going to interview someone about this, what would we expect to find out?

So do we take the researcher perspective or the user perspective here? Exactly, right: assumptions are only based on our own experience, so use your experience of having done this thing. You think people are going to have a similar experience to you. If you're going to ask people about their experience, the assumption can come from your own personal experience: oh well, I have this issue, so I assume everyone else has the same issue as well. That's a good assumption to have. Yeah, wear the customer's shoes.
OK, right, everybody, you can wrap up now. If you've all got your assumptions written down, which you have, we're going to go through now the anatomy of an interview, all right? You will have in front of you some stickers. There are usually three roles in an interview: facilitator, user, and observer. The observer is really useful as the person who can take notes for the facilitator; the observer doesn't talk during the interview. It's important not to have a three-way conversation; you want a conversation between the user and the facilitator. You've probably guessed by now, you need to assign yourselves roles with the stickers. Do that now, quickly, and stick them on you. Who's going to be a facilitator? Who's going to be a user? Who's going to be an observer? All right, I'm a user. No, no, you're going to be the interviewer. Okay, good, right.

So I'm going to talk through some good versus bad interview questions. It's not a surprise: you're going to be doing some interviewing in a minute. You have not prepared for this, so it's going to be uncomfortable and awkward. That's fine; it's just about getting a little bit of the experience of it. We've got handouts, which we'll give you in a minute, with all this stuff on: cheat sheets for how to interview people and what to ask.

So, about questions. Good questions are open-ended. They're things like: tell me about, describe a time, tell me more about this thing. Storytelling is quite good: walk me through the last time you did X, from when you started to the end. But you need to be following up with why. Why was that? How come? Tell me more about that. These are all very curious questions that allow the user to answer from their own experience without really leading them. You're not telling them what you want to hear; they're very much: you tell me. When you hear something that's particularly interesting, you can ask why about that particular thing.

On the flip side, bad questions are closed, answered with a yes or no. Future casting: everyone is dying to ask this question, would you pay for this? You can ask this question if you want, but the answer you get will not reflect reality. People are very poor predictors of their future behaviour. You need quite advanced market research to work out what people are going to pay for your product, how they'll pay for it, and whether they will pay for it at all. The best thing to do is to understand the pain. If you really want to find out whether someone's going to pay for something, try to find an analogous situation where they've moved from a free product to a paid product in the past. What made them move? What was enough to incentivise them to start paying for something? When you understand their past behaviour, then you can start predicting their future behaviour. But don't rely on them to tell you, because they will not necessarily get it right.

And leading questions, which we've talked about already quite a bit: they're questions which suggest to the user what you have in mind. A very simple example of this is: do you like that? "Do you like that" is a leading question. It's also a closed question, so it invites a yes or a no. And it suggests to the user that they should like it.
Then there are varying degrees of improvement on this question until you get to a really good one. This is kind of a bit weird: do you like that or not? You're starting to have balance in there, like there are two ways it could go. This one is really anal: do you like that, or not like that? You probably see survey questions phrased like this, trying not to lead the user and to be balanced. But the best question is this: what do you think of that? Allow them to use their own language to describe what it is that they're experiencing. This is the way to get really good stuff out of people.

Can I ask a question? Yeah, sure. Because you just made it so that instead of "like" you use nothing, so you give them the option to be positive or negative. But what if you specifically want the positive things? Say somebody is always giving out negative answers and you want to focus on the positive, would you still phrase it that way? Good question. So you can do that later on. The idea is that you start quite broad, get information from them, and then start going in. A good question might be: what was your experience of that? What do you think of that? And then they start going negative, negative, negative. Was there anything you liked about that? What did you like about that? That's perfectly reasonable. But it's just best to start by allowing them to tell you.

So we need to be grateful for their time and honest about the purpose of the research. Research is not the same as selling your product, and the number of calls I've been on where the product team is trying to pitch at the same time as doing user testing... These are not compatible at all, right? When you're pitching to a client, they expect you to do the talking, and they're not going to be honest about their experiences with you. Why would they? Keep these things separate. Make sure they know this is research: you're not selling them anything, you want their feedback, their responses to the questions that you've got.

You need to be friendly but not friends. When we're friends with people, and we've all experienced this, I love having great chats with my friends, right? But this comes back to the social desirability bias. You have a wonderful chat, you all feel good at the end, but you're probably not being totally truthful. So it's about creating a bit of space to allow honesty to be there, as opposed to just being chummy and agreeing with everything everyone says all the time. Let them do the talking as much as possible. If it starts slowing down, you're like, uh-huh, okay: those kinds of prompts. Avoid influencing them. And like we've said before, don't assume technical knowledge; use plain language where you can. That's quite important.

So we're actually going to demonstrate what a user research interview might feel like. I'm going to hand over to Sasha, because she's going to interview me. So we're going to do a role play. I will be the researcher. I'm researching a fitness app, and Georgia is my user that I'm going to interview. I will start by asking her broad questions about herself, just to ease her in and find out a bit about her context. Then I will start to ask her more questions about fitness, broad questions, to see what comes out, and listen and pick out some of the things she says, and then start getting narrower and asking more granular questions about the fitness app, because this is my end goal.
But I don't want to start directly by asking her about that, because I might not get as many insights as she could give me, and I'd be biasing her. She would think, oh, she's only interested in fitness, she doesn't want to know anything about me or my life, so she would stick only to that subject.

So, OK. Hi, Georgia. Thank you for accepting my invitation for the interview. I am Sasha, I am a design researcher, and I would love you to tell me a bit about yourself. I'm going to pretend I'm not me. I'm Georgia, I work in travel insurance, I live in London, I've got a husband and a cat. OK, a cat, nice. And what do you do when you're not working? So I work from home, so I try to get out and about as much as possible so I'm not lazy at home. I do a lot of reading, I also do a lot of walking, I hang out with friends, and I also do yoga sometimes.

So now she has mentioned yoga, so I pick up on that. You mentioned yoga, can you tell me more about that? Yeah, I go to a place called Yoga Space London. It's not very far from my home, and I go there maybe once or twice a week. I book the sessions online: I just buy a whole bulk of tokens and then I can book whatever sessions are on. So you're using a website for that London yoga studio you mentioned. What do you like about the website? Actually, the thing I really like about it is that I can cancel the sessions with hardly any notice, just like two hours before, because work gets in the way a lot. People just drop meetings in my calendar, so I often have to move my exercise sessions around.

Okay, and now I'm getting narrower. And where else do you do yoga? I don't do yoga anywhere else, only at the studio. Why is that? I've tried doing it at home, but I don't really have the space in the house, so it's just hard to get motivated.

So you see, I get many pain points from her. She doesn't have space, but I'm not interested in that because I'm doing a fitness app. But she also said she doesn't get motivated, so now I'm picking up on that. And have you explored any ways to get more motivation? Have I done that before? Yeah, I've used some fitness apps before, but I just don't really stick with them. And it's very good to ask many whys, to try to find out everything behind what they say. Now I'll ask her: why? Why did you stop using them? To be honest, I've not got a very modern phone and all the apps are just too large, so I just end up deleting them. So, boom, I've really found the root cause. I'm starting to understand some pains and potential opportunities as well. OK, so maybe we should make our app really lightweight, so that people who don't have loads of room on their phone don't end up with it automatically deleted, which is what Apple does now.

So, some facilitator tips of the trade, actually; we're going to share these with you. The five whys: this is root cause analysis. You're maybe familiar with it? It comes from the manufacturing industry, I think. The idea is: something came off the production line with a fault. Why is that? We made a piece of machinery and it wasn't working correctly. Why is that? Management had changed. Why is that? By asking why many times, you start to get to the real reason. Now, obviously you're not going to ask the user why five times in a row; you don't want to sound like a robot, that would be a strange conversation. But maybe once or twice will get to the real reason. Remember that users don't have this why sitting right there at the top level.
It's not readily accessible, so you have to ask for it; they're not necessarily going to tell you the reason why.

The next one is the echo. You can encourage them to say more. Perhaps they've said something and you're not really sure you understand it: repeat it back to them and they'll hopefully say some more. On a similar thread, the boomerang. This is really good in user testing. Someone is trying out your app and they ask, if I click this, is it going to take me to a confirmation page? They're asking you a question, and your automatic response is to answer. Don't answer the question, just ask it back: what do you think will happen when you press that button? Then they will explain. Any question that comes back at you, throw it back at them.

Silence. I'm really bad at this. As you can probably tell, I like talking and filling in awkward silences. The more silence you can give, the more the other person will fill it, and the user may well say something, that extra little nugget. There are many times in user research interviews where I've used a technique where I'll just have my notebook in front of me. They've said something, they've paused. I don't just go straight to the next question; it sounds like there's something else there. I'll just write some notes and look busy, and then they just continue talking. Some real wisdom can come out there, which is really exciting. That does work.

Then there's the question for when you're just not getting what you need out of them. You can use the Columbo method: would you ever use one of these fitness apps? Just get it out of them right at the end. The purpose of the research interview is to come away with some insights, so if it's not going anywhere and they don't seem to be getting the questions you're asking, you can ask a more direct question. But maybe try the other techniques first before you get there. Sasha has given you copies of all of this stuff that you can take away with you and use as your cheat sheets when you're running user research interviews. You're listening for the reason behind the answer, not just the answer that they give you. Habits, behaviours, issues, annoyances, opportunities: all of that stuff is real gold dust.

In the session that we're about to do now, we're going to interview one user. Really, you would do more than one user, so that you're looking for themes and patterns across those users. I'll talk about how many users a little bit later on. There are some newcomers here, hello. You've missed out on the first activity, but that's fine. What I suggest is that you join a group and be an observer in what's going to happen in the next bit.

Let's try it out. Users, you've got your user stickers on. You guys haven't got your stickers on; put them on, put them on. You're going to be moving around, so it's easy for people to identify you. You're not going to be in your groups: the user from each group, abandon ship. You don't need your assumptions with you. Move to another group. The users are going to move around. Yes, over here. Welcome, guys. Facilitators, you're going to do some question-asking. I haven't prepared you with questions, so it's not going to be easy, right? This is just the experience of speaking to somebody who probably hasn't seen the topic that you're going to ask them questions about. Maybe hide your assumptions. Okay, so just go ahead. Let's do it for ten minutes. Start with those broad questions. Get to know them.
Start narrowing in on what you want to find out. Observers, please listen carefully and note where assumptions are correct or wrong. Okay.

OK, everybody, thank you. Thank you for so willingly getting into the roles, I really appreciate it, that's nice. We were originally going to get back into our original groups, but I don't know whether that's necessary. I'd love to go around each group and maybe just share the experience. I'm interested, first of all, for each team: did you discover whether any of your assumptions were true or not? What about you guys over here? Was any of it surprising?

OK, so our research question was about the experience of conference attendees trying to meet other people at the event, and how to make that experience better. So we assumed that people here are a bit introverted and deep into their own interests, so they can be stuck in their own heads, and that it would help if we helped them to organise a bit, with a workshop for example. One funny thing that we came up with was organising some sort of speed dating for people: just a space with a given protocol for how to meet. Did you find that out in the interview? Yeah, so I feel like we're getting there now, somehow. You're very close to reaching that conclusion, I feel. Sorry, I'm the observer. There were a lot of assumptions that we wrote down, especially if we go with the reasoning behind them. We just discussed this and you felt that we touched on them, but for me as an observer, I think we only touched on maybe two of the six. So for example the speed dating: look at your reaction, you were like, what? Did you notice that? So I think that if you're asking questions, you should really keep your assumptions in mind and see if you can try to validate them. Exactly, and this is why it's helpful not to do it by yourself but to have an observer keeping an eye, because you need a person who isn't also trying to hold all the information in their head while asking non-leading questions; that can be quite challenging to do. So in general, I think there were a lot of open questions, so that's very good. But in the end, did we check our assumptions? I don't think we checked our assumptions.
Well, you didn't write one, but normally you would write a discussion guide, which has the questions in it that will help address the assumptions. I just landed this on you cold, so that's not surprising. But well done.

What about you at the back, then? Did you have the same research question, about conference attendees? Yeah, and then we told them. Yeah, okay. And were any of the assumptions right? We realised as we were writing our assumptions that we should also have been sharing those assumptions, so that we could actually know what each other's assumptions were. Do you mean sharing them within your group? Yeah. I see, okay, that's fine. So I think, personally, some of my assumptions were met and some of them were actually directly challenged. I was the observer there. So yeah, I think it's quite interesting. And that's good, right? But obviously you wouldn't then just take the information from one user; you would do this across many, and as you start to hear the same things, that raises the confidence much higher.

And what about over there at the back? What was interesting was that our research question was about what the ticket-buying experience is like. And one big assumption, which we didn't even write down, is that everyone bought their ticket the standard way, just through the ticket page. But our user actually bought their ticket from another person, from a sponsor who wanted to cover their sponsor fee by selling some of their tickets. So this is an example of getting out of your bubble, right? This is where we think everyone has the same experience as us, but we can be directly challenged by doing this kind of thing, yeah. The assumptions we'd written weren't really a good fit for that user, but it was still interesting to find out; we found out a lot of things. Great, okay.

And over here in this team? What was your research question? The question was: what is the experience like for someone travelling to another country for a conference, and what might make that experience better? So first of all, something we experienced is that asking open-ended questions is very challenging, especially if you don't have the experience. You tend to ask: do you, would you, are you? Closed or leading questions. So it takes a lot of practice and experience. And also I found that going in with the assumptions isn't straightforward, because the user can steer the conversation; there's a human connection. You have some assumptions, but the user can deviate. So keeping the conversation under control and aligned with the assumptions, I found that a bit hard too. Yeah, exactly, maybe because of the lack of experience. I mean, you've come into this interview unprepared, right? In real life there would be some preparation for this: you'd have a kind of script of questions, but you also have to be flexible enough to go off script, because if the user is telling you something absolutely fascinating, follow that, go down that rabbit hole, it's worth doing. But yeah, it's practice. Practice with people you know. And the last point: in terms of assumption hit rate, I would say we were about 50%. Oh wow. We were 50% correct about our assumptions.
For the other 50%, we found that the user had different thoughts and different opinions about it. So we were not completely... But it's all right not to be completely right. Yeah, it's totally fine for them to be wrong; maybe half of them hold up, and maybe some of them shouldn't have been assumptions at all.

Has anyone got any particular questions? I think a nice example with us was one of the questions, which was: okay, what do you think would help you meet more people? Which is a good, open sort of question. It's a bit scary, but in the right direction. And then directly after it came the bias: instead of waiting for the answer it was, for example, do you feel the workshops are helping? And then directly... Feeding him. Yeah, you're feeding him, basically: okay, what do you feel about workshops? And I think that's important: if you ask a question and there's this silence, it might be because that person is thinking. So just step back and give them time to think. And then, if that doesn't work... That's the Columbo question at the end, right? So, what about workshops, do they help you? Right, you are trying to find that out, but try the other options first, exactly. Yeah. So my main point is: be patient in the discussion as well.

I'm curious, based on your experience, how do you analyse what you get here? Are you transcribing the interviews, for example? So, the observer should take all the notes. I also take notes as a facilitator, because it helps me process parts of the interview too: I might underline something, I'll come back to that, things like that. We'll talk about analysis in a minute, but it's a matter of plotting your insights across your users, right? Any kind of matrix, like a spreadsheet, is good for this. And, we'll talk about this too, if you have distributed teams and you want to collaborate, things like Mural are great for that as well. So think about where everybody can see all the information, especially if more than one person is doing the interviews, so you're able to put it all in one place. And actually, getting the insights into that thing is what takes the longest; coming to the conclusions can be pretty quick. When you see it all there, sometimes it comes in a moment. Is it an advantage to record the conversation? Yes, it is. We'll talk about that in a minute.

I just want to hear from the users what your experience was like, and then we'll ask questions about how to go about this. So what was it like as a user? All right, I mean, I'm a user researcher, so I could see some things. Still, I think I was not led by the facilitator at any moment, maybe with one specific question, and some closed questions were also in there. But in general, I think it was a good interview, just keeping the guide in mind, I guess, though I know they didn't have time to create one. What about you, user? I think I had a bit of an artificial perspective because I'm in this session, and I didn't fully process what was happening: I thought they had the same topic, so I was waiting to get those questions. Which maybe is an insight about real users' experience: they come with opinions sometimes, and they think they know what you're going to ask them and they're excited to talk about it, and then it's a completely different conversation than they thought it would be.
That's the purpose of mixing everybody up for this. You could tell you were thinking, they're just making conversation, I'll wait for us to get onto the topic. She was already deep in the topic, yeah. Great. Any other insights from users? What was the experience like? No? Over there, the user? Yeah, I was also surprised; I was ready for different questions. Just saying, more technically, how I felt about it. Okay, great.

So we hear a lot from people that they want a template for how to do user research, right? And it feels more personal and less scientific because it's dealing with real people, and people are messy. So often companies, and I've worked in some of these companies as well, cling to tools and software and user testing platforms, hoping that if they use the right tool, then they will get the right results out of their research, right? And there are loads of platforms and tools out there for doing this stuff; companies are building them all the time. But user research doesn't need software to be done effectively. I'm sorry to tell everybody here that that's the case. Over time my toolkit has diminished a lot: I used to use lots of these platforms and things, but now it's quite simple. So I'm going to talk about how we do research at ConsenSys, and I think maybe we can make this an opportunity for questions and answers as we go through, right? You might want some tips or ask for more information as we do.

So, you know, if you can get the interview part right, then you really don't need to rely on any of the other tools and software, because you can use a user testing platform, but if you ask the wrong questions, you get garbage out. So it's about being able to do the interview; that's the first important bit. It's also the scariest bit and the hardest bit, so that's what people like to avoid and hope that the software does it for them. So we'll just show you how we do it here.

First of all: who to talk to, and where to find these people. We get this question all the time. They really need to be people outside of your work or your bubble. Being totally honest, ConsenSys is a really big company and there are still some projects who only test with other people in ConsenSys, right? These are not our end users. They might be, but how could they possibly be all of the users, you know? It's a pretty niche audience you'd be building for. So finding people to interview is really important. For example, say you're building a product for dapp developers: recruiting those people at a conference like this is a really great thing to do. It's a good way to do it, because they're probably already close to your target user and you can have a conversation with them to establish whether they really are the right kind of person. But then don't do the interview at the conference. People are not in the mindset for taking part in user research; they'll say, oh yeah, I guess, and then they won't turn up for the thing because there'll be something else more important. Just get the contacts, make the connection, and then follow up with it afterwards.

We're not paid to promote them, but I love them: respondent.io. Also userinterviews.com. You create a screener and you specify the type of user that you want. They go and find those people and present them to you in a list with how qualified they are. You can check their credentials, check them out on LinkedIn. They're real people.
You can message them to ask clarifying questions, it's a scheduling tool for all your sessions as well, and they process the payments for the user too. This has been indispensable: I've successfully used it to find space enthusiasts in Kenya, soft commodities traders in Brazil, dapp developers, people who use MakerDAO. You can find these people on these platforms, so give them a try. It's also free to just launch a project; you don't pay for anybody until they've completed it, so you can test whether you can find the types of people you're interested in talking to or not. Just give it a go. I really, really recommend that.

Snowball recruiting is a method where, okay, you've found the perfect user: ask them at the end, who else should I go and talk to? Get them to do the recruitment for you, and just keep handing that on.

And pay them for their time. There are two benefits to this. Some people say, oh, well, then they're not really interested in giving feedback. Absolutely, and that's exactly what you want: somebody objective who doesn't care about your project. Paying just establishes the terms of the relationship; this is a transaction, you're sharing your knowledge and I'm asking you questions. In addition, paying them means they will probably turn up. This is the biggest pain in the ass for a researcher, when your research participants don't turn up. You can't just magic another one out of the air for that hour slot; you've got your observers lined up, and you may have booked a meeting room somewhere as well. So paying them 50 quid or $50 for an hour of their time is well, well worth it. Yeah, cool.

So in one of the previous sessions here it was always a sort of bounty, like $5 or something, and then you get into the discussion of, is that worth my time? And I've noticed sometimes, especially with gift cards and things like that, people just don't value it enough to be worth their time. That's usually an issue.

$5? No, it's absolutely not enough money. Of course, I agree.

But still, even if someone offered me $500 I might not do it. I mean, Google offered me free laptops and I said, no, sorry, I don't have time for this.

That's just the reality with some people, right? You can't necessarily access every single person. So understanding what incentive is going to attract that person is really important. I've done research with CEOs of major corporations; they don't want money, they want a charity donation in their name. So you need to think about what the alternatives are. And be aware: if you use bounties, you're only going to get people who use the bounties platform, right? So it's quite a small pool, and $5 is not a lot of money really, even for that.

Yeah. So I feel like this is something that varies between B2C and B2B.

Yes, it does.

I've mostly worked in B2B, and there I find people are more interested in doing these sessions because they want to learn about the industry as well and get all the latest stuff.

Yeah. Though it can be tricky, because when I've done B2B interviews I've sort of phrased it like, I want your expert opinion, right?
People love being heard, and they like to think the industry is somehow going to react to their pearls of wisdom, right? But it shouldn't become a two-way thing. It needs to be: I'm researching you. Otherwise they'll just come back with loads of questions for you. So it's about establishing the rules of engagement: this is a research session, I'm going to be asking you questions about this. I pay people around $500 for an hour of their time, depending on who they are, and you can find people in specific roles at specific companies through these platforms as well. So it depends. You have to accept that there's always going to be a bit of bias in why people are motivated to take part; that's just real life. But speaking to a few of them will raise your confidence in the responses you get.

Any other questions about recruiting or paying people? Just be aware, if you insist on paying people in DAI or ETH, you're only going to get people who are happy with that, right? So consider that. Also, I don't do that myself because, on a logistics level, I couldn't get finance to work out how to receipt it properly; if you can work that out, then go for it.

How many people? You've probably heard that five is the magic number of users, right? There's actually some science behind it, but it's usually quoted incorrectly, and I'm not going to go into loads of detail about it now. Five is a bit like "how long is a piece of string?", but it's a good number to start with per user type. It's not five in total, with one dapp developer, one crypto trader and one asset manager; you need about five of each of those to understand the themes and patterns. But really you do it until you stop learning. No users, no insights. One user, loads of insights. Two users, not twice the insights, because there's probably going to be some overlap between them. So you get diminishing returns. I'm working on a study at the minute about DeFi users, people using Maker and Compound and things like that, and we've got to about twelve and now I'm not learning anything new. That's a really good indicator that I can stop, formulate the insights and make the recommendations to the business.

And it's about doing this little and often. If you imagine your company is like a ship, you've got your end point marked on the map, but you need constant course correction. It's much, much better to do a few users often than a 20-user study at the start that you never repeat, because you'll probably be well off course by the time you launch your product. Doing a little bit often helps you correct course and gives people alignment around what they should be building. Any questions on numbers? Cool.
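As an aside on those numbers: the "five users" figure is usually traced back to a simple diminishing-returns model from Nielsen and Landauer, in which each participant surfaces a fixed share of the remaining issues. Below is a rough sketch of that curve; the 31% per-user discovery rate is the commonly cited estimate from that literature, not a figure from this talk.

```python
# Sketch of the diminishing-returns model behind the "five users" rule of thumb:
# share of issues found after n users ~ 1 - (1 - rate) ** n.
# rate = 0.31 is the commonly cited per-user discovery rate; treat it as an assumption.
rate = 0.31

for n in range(1, 13):
    found = 1 - (1 - rate) ** n
    print(f"{n:2d} users -> ~{found:.0%} of issues surfaced")

# With rate = 0.31, five users surface roughly 84% of issues, and by a dozen users
# the curve is nearly flat, which matches "stop when you stop learning anything new".
```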
Okay, so the setup. We've just done it in person, and you can do it in person; that's great. There are pros and cons to both. I do most of mine by video call. The reason is that we're a distributed company and I work a lot of the time from home. And sometimes I'll get, say, Sasha: Sasha will dial in with her camera and sound off and she'll be the observer, and I'll explain to the user, I've got my colleague Sasha on the call, she's just taking notes for me. So it's not a distraction having loads of faces popping up.

So make sure the video's off, yes.

Why would you have the second person on the call if you're already recording it?

I challenge you to go and rewatch your videos. It's very time-consuming, and most people don't watch them.

I mean, it doubles my time: I do the interview and then I'd have to watch it again to make all the notes. But if there's a second person watching it live... And if you do it asynchronously...

People just don't watch the videos. I know what you're saying, and some people might, but in my experience, even when the CEO is really interested in what you're doing, you send them links and they don't watch the videos. So the video is there as an artifact for me to rely on in case we fucked it up, so I can go back and go, what the hell were they saying? My notes don't make any sense in this part; I can go back to that. You might make a highlight reel for people if you really want to; that's very time-consuming, but also very powerful.

The observer will also behave differently when they know it's live rather than recorded.

Absolutely, they pay attention. When you're watching a video, are you ever just watching the video? You're on Slack, email, texting, whatever. The live observer has to pay attention. Exactly.

There are some positives to video calls as well, because they require less of the user to take part. They can do it at a time and place that suits them and book it in themselves, so it's easier for them to take part. If you require them to be in a particular location, you have to pay them a lot more money to turn up. There are also some problems with inviting them to your office. Say you work at a really cool tech company, with post-it notes everywhere and cool furniture: that can all influence the user's perception of your brand and your company. You can book meeting rooms by the hour in anonymous places; I've done that before. Meeting them in a coffee shop can work too, as long as it's quiet enough to have a conversation without being too disruptive. But video calls work. We've got global audiences, global customers. The days of the usability lab are well over; that's an old-school approach to testing and interviews, and there are better ways to do this that are much, much cheaper.

There's a difference, though, between usability testing, a very structured sequence with eye tracking and seeing which button they look at before they click, which would be the old-school usability microscope, compared to this more subjective, fluid, interactive dialogue. Is that correct?

We're focusing more on the dialogue element. Eye tracking looks cool, but it tells you the what, not the why. Eye tracking says, oh, the user looked at this, then they looked at that; you still have to ask questions to elicit the why behind it. So you still need all that other stuff. I'm personally not a fan of some of those things. I did a neuromarketing course at UCL in London, where there's a movement towards putting people through MRI scanners to try and understand their brains and behaviours, and it's still guesswork at this stage. By the end of the course the message was basically: just interview somebody and you'll get the same insights. There was a place for usability labs when there wasn't an easier technology to do it. It's a trade-off, right?
You might get the most in-depth usability report possible in a lab, but is that worth more than just doing a few users often? There's a trade-off to balance.

One comment: I've worked in user experience design and product management and so on, but I've also been a test subject for a big software company; I was the guinea pig. I went to the usability lab, people with white coats and so on, went through a sequence, did my little thing and said goodbye. The product came out later; I saw it a year on and it was just awful.

So I get what you're saying. That's the over-reliance on technique and science and software: the idea that if we put an eye tracker on, we'll know what's going on in their brains. It's not necessarily true. It's about being able to ask the questions as well. You had a question?

Instead of eye tracking, how do you feel about things like Hotjar, which makes heat maps and follows mouse movements?

A couple of things. Firstly, with Hotjar you need users; not like a user test, you actually need a large sample of people using the product. One of the problems is that if you're early-stage in the crypto space, you may not have enough users to get much insight out of it. It's basically the same problem as running Google Analytics on your platform.

You can also do session recordings, though. You can watch one session, which is extremely useful: one user going through a flow, and you see how they do things. If you watch 20 sessions, you get a better feeling for how people are doing it.

The real benefit of that is that those are real people doing real things on your site, rather than a user test, and that is absolutely valuable. The only thing I would say is that it needs supplementing, because it's still the what, what they are doing, not the why. You don't understand their context, you don't know what else is going on in their life. I've watched some of those videos where they're not doing anything for a while; what's going on in that time? You don't know why they pause, why they click on the things they do, or what they're looking for when they start. But they can be interesting. That's quant data, right, though elements of it sort of bridge the two: you can dive into it at a granular level but also look at it as a collection. I think all successful companies need to do both. You've discovered behaviour patterns, which is cool, but they are starting points for your research. You can form your assumptions from that information: oh, everyone goes to this page and then they do this thing, that's weird, I wonder why, and then go and find out.

Exactly, yeah. So I think there's room for both, for sure.

So, a couple more things. Video conferencing: we use Zoom. Zoom is generally easy, but think of the number of times a session has stalled because the end user couldn't use the software you demanded they use. Just ask them what's easiest for them and use that. It's easier for you to set it up at your end than to spend the first half of the interview helping them find the share button. So this is quite a good tip: go where they are in terms of the software and tools you use.
Record it, but always ask permission beforehand. It's really beneficial to record so you can come back to it, but don't just ask at the start of the session, oh, by the way, I'm recording this. Ask when you're recruiting them, because then they'll almost always say yes. If you land it on them at the last minute, people don't like being surprised by that kind of thing, so don't spring it on them. You might have consent forms or NDAs; that's down to you as a company. I think some companies really overestimate how much of their own information they're going to be sharing with the user. You're asking the user all the questions; hopefully you're not going to tell them loads of stuff that's secret, right? So it can be a bit heavy-handed in some contexts. Some companies insist on it; speak to your legal team if that's the case. On most of the projects I work on, we don't use NDAs.

So the last bit, we're nearly done, is about making sense of the findings. You've got all these patterns and themes coming out, and you're going to create recommendations and actions: oh, I think we should be doing this. It's good to do this as a team as well. But you need to prioritise them, because especially if this is your first research project, it's all low-hanging fruit: you're going to find loads and loads of things you want to do or change about your product. In which case it's worth plotting them on something like this. The x-axis is technical difficulty and the y-axis is user value or revenue. As a team, decide which quadrant each recommendation goes in. If it's low technical difficulty but really high user value or revenue, it's in the top left: easy and important. Those are probably the things to work on first. Stuff that's high technical difficulty and high value is important, but it's big, so maybe work on that in the background. Stuff that's low value and low technical difficulty isn't that important to be working on right now. And then you can go to the CEO who's been demanding that feature over in the high-difficulty, low-value corner and say, look, we've plotted it, it's not worth doing, so let's focus on the top-left things.

So it's really simple. It's an activity you can do in about an hour from the research you've done, and it starts getting the whole team aligned: right, that's the stuff we're going to focus on. You can even decide in that session who's going to work on what and create your tickets for it, whatever your process is. So it's not rocket science; it's just getting to this point. And remember that it's not certainty, it's always about course correction. You're going to have new assumptions and new hypotheses coming out of the research you've just done: oh, we've only just thought of this, I need to explore that too. Research usually leads to more research. But I mean, I like that, personally.
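To make that plotting exercise concrete, here is a minimal sketch of sorting recommendations into the four quadrants, for anyone who prefers to script it rather than use stickies. The recommendations and scores are invented placeholders, not findings from the talk.

```python
# Sketch of the prioritisation quadrant: plot each recommendation by user value
# (or revenue) versus technical difficulty. All items and scores are invented.
recommendations = [
    # (name, user_value 0-10, technical_difficulty 0-10)
    ("Clarify gas fee copy",    9, 2),
    ("Rebuild onboarding flow", 8, 8),
    ("Dark mode",               3, 2),
    ("CEO's pet feature",       2, 9),
]

def quadrant(value: int, difficulty: int) -> str:
    easy = difficulty <= 5
    important = value > 5
    if easy and important:
        return "easy & important  -> do first"
    if not easy and important:
        return "hard & important  -> plan in the background"
    if easy and not important:
        return "easy & low value  -> maybe later"
    return "hard & low value  -> probably not worth doing"

for name, value, difficulty in recommendations:
    print(f"{name:<24} {quadrant(value, difficulty)}")
```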
So, yeah, we've got about ten minutes left. The last thing I want to do is give you what we recommend as great books. The first one is Daniel Kahneman's Thinking, Fast and Slow. It's all about cognitive behavioural science, the cognitive biases we have, and why people do the things they do. If you read it, you'll hopefully have a slightly better understanding of how people respond to your product and your research, and of how you yourself respond to the world. The book in the middle is an excellent resource for learning to do UX research, especially if you don't have a researcher; it's easy to read, quite entertaining, and I recommend it. And then Steve Portigal's Interviewing Users goes into more detail about uncovering insights through interviewing techniques.

And the very last thing: we need participants for ConsenSys research, and you get paid. So if you fit any of the following criteria at all, please email designresearch@consensys.net, or scan the QR code, which will take you to it. It's really important that we get out of our bubbles; I guess this is sort of a bubble at Devcon, this is about as bubbly as it gets. We're specifically looking for people in this region, or who are DeFi users, or who are dapp developers. You don't have to be all three of those things, just any of them. Drop us an email and that would be really good. So I think we can take some questions in the last couple of minutes, if anyone has any.

Are you going to share the slides? This one? The whole deck? We can do, but I don't know what the method is for that. Maybe email us. Sure, if you want a copy, then yeah.

So, based on your experience, and I've worked in Web2 and now in Web3: what's different from the older research methods, anything you would use specifically for Web3? Do you see a difference or any nuances?

I can go first and then you. Personally, no. I think the attitude from the teams is different, but the techniques are the same, and the people might be the same as well. We don't need to reinvent everything we do; there are tried and tested things that work, and all tech companies are doing this. What my experience has been, and that's why we're doing this workshop here, is that we've seen lots of crypto and blockchain teams just not be up for trying this because it's not in their wheelhouse. They haven't had experience doing it before, and they assume that because blockchain is so brand new, no one knows what they need, so there's no point in asking them. And that leads to this problem: I know teams who've been building things for two years and haven't found a use case for them. What a waste of money and effort and time. Do you want to add to that?

First of all, thank you. Do you think it is still viable to conduct user research at each stage of the product lifecycle?

Yes, absolutely. We're encouraging you to do it early, but it's, I guess, better late than never, right? And also, we work on a bunch of products with really big user bases, turning over 200 million a year, and we're doing testing all the time there.

There's one big challenge: there's a lot of friction with project managers, with developers, with all those types of people. What is the best way to convince the team that we really need to do user research?

It's about the usefulness for them. The first time is difficult, because, like we talked about with the biases, people are resistant to hearing feedback, or they're not quite sure how it works and what they'll get out of it. My advice is definitely to include them. Don't just do it on the side, because that makes it look like you don't trust any of their opinions.
You're just going off and doing some secret research, and then you come back with, hello, I spoke to these users and they basically said, fuck you. That's not a good approach. This book, the one in the middle, addresses this exactly; you can get it free online as an EPUB, but buy the book. It's also about getting buy-in, absolutely. The most compelling experiences I've seen come from insisting that the person who is resistant to research observes all the sessions. Don't rely on them to watch any video, they won't watch the videos. You're the observer, come and sit in here. And once they start hearing from people, they can't butt in, they're not allowed to say anything during the session, and afterwards they're just going to be like, okay, we need to do more of this. But it's getting to that point which is tricky. Within ConsenSys design there are quite a few researchers and we're trying to do this with all the companies; I've spoken to Gnosis before about doing research, to help them understand some of the benefits of doing it, or that they're already doing it, so that's good. It's just about education in the industry.

And, sorry to interrupt, I think a lot of teams, well, lots of people made lots of money in ICOs, and that's got nothing to do with their ability to run a company, and as a result they're not necessarily asking the right questions.

I'd like to add to that. I've seen a lot of companies where, first of all, the vision is not clear, then the strategy, the main objective, the main challenge, the strategy for how to get there, and if you go further, the use cases are also non-existent. "We're building something really cool." Yeah, it's really cool, but cool isn't a reason to do it. I've seen this happening around me, also during this conference a lot, and I feel it also has to do a bit with the maturity level of the people.

Exactly, yeah. Which certainly makes sense, because there are a lot of young people. So we're going to create a ConsenSys academy, with modules on doing user research, and we're going to try to get out there and find people so they can learn these skills themselves.

Actually, as a UX researcher, your job is also to educate people, to educate your team and tell them why it's so important. So you can pass the lesson on.

I just wanted to double down on one point. You said if you have a resistant team member, you can have them start as an observer. So the observer is the most comfortable position of the three?

Exactly, and they don't have to do anything except listen and take notes.

That is also very difficult to do well.

True. The advice on note-taking is: when in doubt, write everything down, every single thing that's said, because then you can interpret it later. Take raw data, as close to verbatim as you can, and interpret it afterwards.

But after positioning them as an observer, do they come around? At the beginning they're very resistant, but after being observers, do they change?

Not all of them, maybe, not exactly. My advice is to get them to sit in on two or more sessions, because one thing they can do is latch on to that one user and go, we need to change everything, because it was a really good session and that user was really verbose. But of course that's not user research; you need to speak to a few people, so more than one.
And very much keep in mind confirmation bias: they might latch on to what they want to hear and go, do you remember that user?

At least they want to listen to users as a start, even if they're hearing what they want to hear. I think it's a really good start to get them interested in the sessions too. Any other questions?

A learning for me at this conference (this is my first Devcon, though at other crypto conferences the audience has been a little broader), particularly when it comes to DeFi, is that we're actually not building for the people in this room, even though a lot of people think we are. So what works for you in terms of finding people around themes, general consumers who might be inclined to use the products you're building, rather than recruiting around, say, income groups?

Ignore demographics. Demographics are really overrated; they're proxies for what you're actually interested in. Who cares how old a person is, or what their household income is, if they're behaving in the way you're trying to find? On platforms like these, or with any recruitment, use a screener: a series of questions to determine whether they are the right person for you. For example, if I were finding DeFi users, people who use MakerDAO, I'd ask them first of all: do you hold cryptocurrency, yes or no? Starting broad and getting more specific, just as you do in the interview itself. What coins do you hold? Let them type them all in, so you can tell whether it sounds like they know what they're talking about or not. Which of these have you used before? A long list of all kinds of different dapps, with the DeFi products in there, and they have to have selected one of those in order to qualify for the research. The behaviour is: have they done this thing in the last six months? That's what I care about. I don't care how old they are or what city they live in; that's not really very useful information for you.
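To make the screener idea concrete, here is a minimal sketch of behaviour-based qualification, going from broad to specific. The questions, field names, and the qualifying dapp list are illustrative assumptions, not the actual screener described in the talk.

```python
# Sketch of a behaviour-based screener: start broad, get specific, and qualify
# on recent behaviour rather than demographics. The field names and the
# qualifying list below are assumptions made for illustration.
QUALIFYING_DAPPS = {"MakerDAO", "Compound", "Uniswap"}  # assumed DeFi products

def qualifies(answers: dict) -> bool:
    # "Do you hold cryptocurrency?" - broad opener; screens out non-holders.
    if not answers.get("holds_crypto"):
        return False
    # "What coins do you hold?" - free text, used as a sanity check that they
    # know what they're talking about.
    if not answers.get("coins_held"):
        return False
    # "Which of these have you used in the last six months?" - the behaviour
    # that actually matters: they must have used at least one DeFi dapp.
    used = set(answers.get("dapps_used_last_6_months", []))
    return bool(used & QUALIFYING_DAPPS)

# Example: this respondent qualifies; someone who only holds coins would not.
print(qualifies({
    "holds_crypto": True,
    "coins_held": ["ETH", "DAI"],
    "dapps_used_last_6_months": ["Uniswap", "Some game"],
}))
```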
Anything else?

One thing that's an issue: I think user research is a great primary source for understanding the why, digging down, but there are also the online analytics, the runtime analytics, and you need both; they're often equally valuable. The question is, how do you work with that function? Is it a separate function?

In lots of companies it's totally siloed, and that's really frustrating. I've had to build relationships with the data analytics teams at other companies I've worked at. You're probably on a smallish team, so it's not such a problem to reach out to that person, but we should be working together; I don't really know why that hasn't been the case. If you're working in crypto, well, it depends on your product: if it's a brand new product you haven't launched yet, there are no analytics about it anyway. But 100%, they need to work together. You should inform your assumptions and hypotheses from what you're seeing, which countries people visit from, where they click, how long they dwell on things, or tools like Hotjar as well. They can help form your hypotheses, but they won't tell you what it means to the user, which is the thing you can build on to improve your product.

OK. Just drop us an email if you're interested in taking part in research, or come and talk to me afterwards if you like. I'm only around for an hour because I've got to fly to avoid the typhoons. Thank you so much.