capture the QR code. You may also want to jot this down. So this PDF will be available for quite a while? Yes — it's not a PDF, it's a Google Doc. It will be available for a while, but most likely after this session we will revoke the access. It should be accessible through your phone. We're not printing it because if we printed it, it would be this thick, and we don't want to waste paper. So we only waste paper for ourselves? Yes. We recycle paper. So just give everyone a bit of time to capture this. Any issues with the handout? No? Okay. I still see some phones up there. If the QR code doesn't work, maybe you can try the link. We do have feedback from some friends we tested this workshop on: sometimes it doesn't work, and we don't know why it wasn't working on their phones. So you might want to copy down the URL as well. Okay, I think we can move on. So, moving on: starting with a very brief introduction to UX research. We'll talk a bit about user experience. User experience is experience. It's not design; it's not visuals. I will show you the easiest example of user experience, one that most of you would have experienced. We start with the onboarding of Singapore. You arrive at the airport, and usually you reach the conveyor belt after your luggage is already on it — the conveyor belt gives you your luggage faster than you can reach it, which is a very good airport experience. And then, if it's your first time in Singapore, you can go to probably the coolest mall currently — Jewel — which has a fountain and a forest inside the mall, just a few steps away from the airport. And once you're done with your first experience in Jewel, you can go to the city — the MRT is just downstairs from the airport.
Or you take a cab along the ECP, and on the ECP you will see the Kallang River on the right and MBS on the left — the million-dollar view. So user experience is about experience rather than just design. It is something that humans are experiencing. In SP Digital, we always start with the people we are designing for, and we end with a solution that meets their needs and pain points. We have two guiding principles in SP Digital. The first is being customer-centered: everything that we do, we always start with the users. It doesn't come from someone at the senior level telling us what to do — everything comes from the user, and more specifically the user's problem. The next one is being data-driven: everything that we do, we always try to measure based on data. So research is not only about qualitative; it's also about quantitative, which I'm going to get deeper into in the last section later on. In SP Digital we use lean, which starts with learn. Actually, before learn, sometimes we gather ideas using a design workshop. After a design workshop, we test the designs with our users, and then we build the product. Once we build the product, we use GA to measure the performance of the features, because we believe that measuring with numbers — quantitatively — is the easiest way to measure our success. So we talked a little bit about the user experience process; that was a very brief introduction. Now let's move on to talking about research. The research term itself in software development is actually quite broad, so over here we mapped it out, from market research all the way to user research. If you think about it, on one end there is market research, which deals with market sizes, trends, regulations.
These are all numbers that need to be sorted out. What we are concerned with for today's workshop is more of the user side: dealing with specific user needs, workflows, pain points, usability benchmarking, all that kind of stuff. So research is a very, very broad term that covers all of this, and we mapped it out on a spectrum from the macro to the micro level. We also want to express the idea that, when it comes to research, market research and user research have to go hand in hand. It's not as if the market research people just do their thing and that's it, and the user research people just focus on usability benchmarking and that's it. We actually have to collaborate pretty closely, and we find that when people on the two ends of the spectrum work closely together, you get a lot of cross-pollination of insights. So how do we define user research? User research specifically is about understanding user opinions, needs and motivations so that we may be able to build the right thing right. I stole this from somewhere — I forgot the source, but I wrote it down — so please do not attribute the whole quote to me. For today's workshop, we're focused very much on user research, and when it comes to user research there are two main kinds: qualitative user research and quantitative user research. On the qualitative side, what are we dealing with? We're dealing with a lot of observational insight. It depends very much on the interpretation of the designer, the researcher, the PM, or whoever is conducting the research to interpret the data found through the research. On the quantitative side, we're dealing with how much, and what exactly is happening. We're dealing with a lot of measurement.
And for SP Digital, because we use the lean process, we basically have this measurement cycle towards the end: we define what we want to measure towards the end of every two-week cycle, and then we measure to see whether or not we're on the right track. For today's workshop we're focusing very much on qualitative user research, especially on interviews, but we'll touch a little bit on quantitative research, especially in the last section. So that's it for a very, very quick introduction. Now we get to the more fun stuff. Okay, so when we talk about user interviews: a user interview is basically a one-on-one session with our users in a more intimate setting, asking behavioral questions to understand their needs, their pain points and their expectations. I will show you a week of our research. The reason I want to show this is that when you try to Google a research process — day one to day five — usually it goes very high level. I'm trying to say that that is not always the case when you do research. I will show you some of the things we do when we do research in a week, and it's not always very tidy; sometimes it's very messy. So we start, probably not on the first day, but on day zero, a Thursday. On the Thursday, we will identify the product, and then the topic and the targeted participants. We will start crafting a research plan, and we will make sure our understanding is aligned with the POs and the other stakeholders as well. The next day, most likely there will be a design workshop or design sprint. It doesn't have to be a design workshop or a design sprint, but typically we do a lot of design workshops because we have so many stakeholders.
When we do a design workshop, we always try to do our homework before the workshop, so that after the workshop we can just execute everything. We always try to make sure that on the day of the workshop we have at least almost finished the planning, so that we can execute the next day. So on the Friday we will have the design workshop, and we will blast out the recruitment form as well, based on the targeted users that have been aligned. The next day, the designers prototype the designs for testing that came out of the design workshop, and as researchers we schedule the participants based on the submissions from the Typeform. The reason we typically blast the recruitment form on Friday is so that we have two days — the weekend — to get submissions from our user pool. And yes, on Tuesday we start the interviews. When we do interviews, typically we interview seven to nine users, so in one day we will interview four to five users. It's tiring, but I think it's doable. While we do the interviews, we try to clean up the findings on the go as well. The next day, Wednesday, we do the next round of interviews — if we did four on Tuesday, then we do three on Wednesday — and we clean up findings on the same day. On Thursday we clean up the findings, synthesize and analyze the insights, and craft the research report at the same time. On Friday we do the presentation of the report, not forgetting to blast a thank-you email to the users. So it has never really been structured; it's always quite messy. And I would say that I wouldn't do this alone.
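As an aside, the scheduling step just described — screening Typeform submissions against the targeted-participant criteria, then spreading four to five interviews across Tuesday and Wednesday — could be sketched roughly like this. All the names, fields and screening criteria here are hypothetical illustrations, not the team's actual tooling:

```python
from dataclasses import dataclass

@dataclass
class Submission:
    name: str
    age: int
    is_existing_user: bool

def schedule_interviews(names, per_day=4, days=("Tue", "Wed")):
    """Fill the first interview day up to per_day slots, then spill to the next."""
    schedule = {day: [] for day in days}
    for name in names:
        for day in days:
            if len(schedule[day]) < per_day:
                schedule[day].append(name)
                break
    return schedule

# Hypothetical Typeform submissions, screened against hypothetical criteria
pool = [
    Submission("Alice", 34, True),
    Submission("Ben", 19, False),
    Submission("Chitra", 41, True),
    Submission("Dan", 28, True),
    Submission("Eve", 52, True),
    Submission("Farid", 30, True),
]
screened = [s.name for s in pool if s.is_existing_user and 21 <= s.age <= 65]
print(schedule_interviews(screened))
# → {'Tue': ['Alice', 'Chitra', 'Dan', 'Eve'], 'Wed': ['Farid']}
```

The point of the sketch is only the shape of the work: filter the pool down to the aligned target profile first, then cap each day so no researcher runs more than four or five sessions.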
Sometimes I do it alone, but a lot of the time now I have another colleague of mine with amazing research chops, though she can't be here, unfortunately. But basically this is how we do it, and we always try to get the findings reported to the stakeholders within a week, so the delivery is on time as well. So I don't know about y'all, but when I first joined the company and saw that there was a standard schedule, I was a little freaked out, because I'd never done so many interviews in two days before. So kudos to Nani. Thank you. Yes — question? What do you mean by "clean up findings"? We will run you through that in the next section. Before I forget, one thing we want to try out with this workshop: I am personally a very introverted and quiet person by nature, so if you have any questions and you don't want to interrupt people, you can also write them down on a Post-it note and pass them to any of the facilitators. Can you raise your hands? Yeah — you can give it to the facilitators over here, and they'll pass it to us. We'll try to answer all of the questions towards the end of the session, or if not, you can find us at the end of the session. I just want to try this out so that everybody can ask questions, even if you're a little shy. Anyway, jumping back to the research topic. Earlier on, Nani mentioned that on the Thursday she would decide on the research topic. So how exactly do you decide on the research topic? For this, I actually think you could write an entire article or run an entire workshop about it by itself, but because we don't have the time, I'm just going to breeze through it really quickly.
For me personally, deciding on the research topic is an art and a science by itself — it's really difficult. If you ask me what I want for lunch, I don't even know what I want for lunch. So how do I figure out what the hell it is that I don't know and want to figure out? But really, finding the research topic boils down to one simple question: what do I want to know? (That's the fault of the animation — don't mind that, you get a preview first.) Finding the research topic can start from a lot of different places. I tried to Google this — my own little homework — to figure out how people decide on a research topic, but I couldn't find any good resources, nothing like "this is definitively the framework which will land you on the perfect research topic." Through our own experiences, we found that figuring out what to even begin researching may come from, number one, a hypothesis from the roadmap. For example, your company founder or a product manager or a product owner comes to you and says, "next quarter we are going to build this thing," and you're like, "I'm not too sure about it" — that's prime for research. Or you know your company vision, you know where you want to get to, but you're not too sure how to get there, you don't have a clear path — that's also prime for research. Your research topic may also come from previous rounds of research: maybe you've been speaking to a lot of people, but you're getting more questions than you're getting answers — well, that's your follow-up research. Otherwise, maybe the team has done some blue-sky exploration and has concepts it wants to test and validate — that's prime for research: you can get a lot of people, test the concepts, and figure out whether or not they actually work, whether they make sense.
Number four: if you have an existing product that's been around for a while, people have been adding a lot of different stuff to it, the product has a certain age — you might want to do some benchmarking on it, to figure out whether, with all of those different teams working on it, the product still makes sense. And number five: again, you have an existing product, you have a support forum, or maybe a Twitter account, and people are complaining about your stuff on the Twitter account — you want to investigate more. That can be a research topic as well. (That's the animation part.) So after coming up with a research topic, you need to figure out how exactly to go about finding the method you want. Going from research topic to method really starts with this: first off, what do I need to know? We need to first understand what question we are asking ourselves, what question we are trying to find the answers to. The next question is: what data do I need to answer this question? Let's say we are revamping our product and want to redefine its information architecture. What information do I need to be able to confidently redesign the information architecture of this site? Or let's say I have a site navigation and I want to figure out what should come first. Do I need a rough sense of rank preferences, or a definite sense of rank preferences? This will tell you whether you should go with maybe just a quick interview, or with a more robust card sorting exercise. So you first have to figure out what exactly it is that you want to know, and then what data you need to answer that question. Is it qualitative? Is it quantitative? Is it something we don't need to be very confident about? Is it something very difficult to find out? What is the cost of finding this information? And once you have that, you need to figure out: how do I get this data? I need rank preferences — then how do I get rank preferences? Do I get existing users? Do I just go onto the street and find random people for a quick guerrilla test? Do I blast this out to many, many people? Do I recruit people in, and why? So the method is determined by the research topic you want to research, not the other way around. When we talk about how to do the research, there are a lot of research methodologies: there's qualitative, there's quantitative, and inside qualitative itself there are many more, from user interviews to ethnography to diary studies — you name it. Our team doesn't focus exclusively on one, but we do a lot more user interviews than the others, because we want to get feedback as fast as possible from the users, either before or after launching a product, so that we can measure our success. We believe the user interview is probably not the method that gets the best results absolutely, but it gets you the most results for the time taken. So when we do user interviews, we start with the planning. These are the things our team usually includes in a user interview plan — for those taking pictures, don't worry, we have everything in the handout, with the details as well; whatever I'm going to talk about is in the handout. We usually start with an introduction and overview, because we always distribute the research plan to our stakeholders, and all the stakeholders — the PO, designers, engineers, whoever — will look into the research plan. When they see the research plan, not everyone will understand the context: why we are doing this interview, why we are doing this research.
Typically, in the introduction and overview, we note where this came from — a design sprint, a design workshop — why we are building this, and why we are doing this research. We continue with the objectives: why you want to do this research and what the expected outcome of this particular research is. Then we go to the participants. This is about the targeted users for this particular research — maybe not the targeted users of the whole app, but a subset of them, more specific ones. The targeted participants should be aligned very well with the PO and your stakeholders. Next is the timeline, which usually covers the timeline of the research — for example, when the research report is expected — and the schedule of the user interview sessions, in case the stakeholders want to join or watch a session. Usually we have a Google Hangout and we broadcast the interview, so everyone in the company can watch it as well. Then we get to the questions. We usually start with discovery questions, which go back into the user's previous experience, and then go in with the design questions, which are intended to evaluate the designs. And usually there are the next steps, where we include the measurement plan — for example, when we launch this product, what are we going to include in GA, what is the tracking plan for GA, what will be in the measurement plan. But whatever we put here, you can include whatever makes sense for you; you don't have to include everything. In my previous company I didn't include everything, because it was a smaller company. But because this is a bigger company and there are a lot of stakeholders — stakeholders not even inside SP Digital, they could be outside SP Digital — we always try to make it very comprehensive, so that people without context can understand our research. So now that Nani has gone through the research plan, let's talk a little bit about recruitment. I had an interesting conversation before this workshop with one of the volunteers about recruitment, so I wanted to do a quick poll: how many of you have done recruitment for research before? How many of you have no issues whatsoever doing recruitment? "It depends"? I need to learn from you — please teach me afterwards. So, recruitment. We obviously can't do qualitative user research — which is about observing users — without actual users, so recruitment is a big part of qualitative user research, especially for interviews. It's still a challenge for us, and it's slightly different between Nani and myself, because we deal with different business verticals. For Nani's side, I'm envious; for mine, it's a little — let's not go there. Anyway, we'll talk a little bit about how we generally do recruitment, and a little more about why I say she's better primed than me right now. (Sorry for all the animation hiccups — let me just go back. Okay, I'm just going to show you everything.) The number one thing we have in our situation is that on the consumer side we have quite a big user pool, which we often contact and reach out to when we have research. Like Nani was saying, she blasts out the Typeform to the user pool before the actual research takes place. We have around — I can't remember the numbers — I think about 60 to 70 people. And it's not a week; usually we do it on Friday and by Monday we have submissions already. So what we've done for the consumer business side is build up a user pool which we can easily tap on at any single point in time and say,
"Hey, we have some research — are you able to come down to our office at this particular time so that we can do a one-hour interview with you? And yes, you'll be paid for your time." We also have all their relevant characteristics — for example their age, the area they stay in, maybe their occupation, their tech savviness, etc. — so that for a particular feature we're researching, we can easily know who the people we're targeting should be and how to actually reach out to them. So we have the user pool, and that speeds up recruitment a lot. Having a user pool speeds things up a lot, but there's also a risk: if everyone in your user pool has the same profile or background, you may get biased insights. So there are pros and cons. How do we even build a user pool in the first place? That's a different question — we have more details in the handout. Another method we often rely on is to hire a recruitment agency. One of the recruitment agencies is UXArmy — they're not paying us to say this, by the way; not an endorsement. In my case, I deal with B2B people — my business vertical has to do with business-to-business software. For example, I need to speak to building managers. For this building you're in, someone is in charge of it, someone is in charge of the utilities and of making sure that everything is functional. How do we recruit building managers? The profile of building managers: generally they're not very tech savvy, and they don't trust you. Most of the time you approach them and they're like, "why are you asking me all of these questions? Get away from me. I don't want you to know all of this; you will get me into trouble." So that's the profile we're dealing with, and the only ways so far that have been the most accessible are to rely on sales and account incentives, or to rely on external help like recruitment agencies such as UXArmy, which has been a huge relief for me, actually. The third way of getting people for research is support, feedback, and request tickets. Early on I said you may have a lot of feature requests coming in through forums or help centers — these people are your ideal research participants. Someone is complaining on your Twitter account? Pounce on them. Grab them. They are ideal: if they hate your product so much but they're still on your product, speak to them — they're going to give you so much insight into why your product is not doing as well as it should. The fourth method that we've found successful is to go to events and locations where your target users frequent. Let's say you're designing or selling for new parents. You would want to go to one of those expo convention events where they sell baby products, stand outside, stop random passers-by to find out if they are new parents, ask, "hey, can I have ten minutes of your time?", and tell them why they should join your research and what benefit they would get out of it. Or let's say you're trying to target HR professionals — then you want to start staking out HR conferences. Right now, because I'm working with building managers, I've learned so much about managing a building in the past six months, oh my goodness. Anyway, going on. So you have the recruitment — then what about communication?
Okay, for us, one of the things we really aim for when we do research is to deliver the insights as fast as possible, and we realized that one of the things that takes the most time is the research ops side rather than conducting the research itself. So other than having the user pool, one of the things we do is templatize all of our emails, which you can see in the handout on page 13. In an email to our users, we include who we are, the topic of the interview session, when we are going to do it — the detailed time and date — what's in it for the user, the value for the user — whether we're going to give cash, for example, and how much — and whether the session will be recorded or not. We even put details in our email templates for stuff like: our meeting room is extremely freezing, so you might want to bring a jacket, because you'll get distracted by how cold it is. So we templatize everything. As you can see over here, page 13 is our template for consumers, and page 14 is the B2B email template. The next thing we do is make sure that we cover governance — governance as in the consent form. Having proper governance, a proper consent form, assures that the participants understand what they are signing up for: that they understand the details, like whether this is recorded or not, and what kinds of things they will talk about; that your research is ethical; and that it complies with regulation. In Singapore that's mainly the PDPA — I think the government has recently been doing more with the PDPA, and everyone in the space right now is migrating from IC login to email login. So the PDPA is something you have to look at. On the consent form, typically what you need to cover is, first, about yourself: who are you — a senior experience researcher, a designer, a product manager — and who are you representing? For myself, we are representing SP Digital; SP Digital is a subsidiary company under SP Group. The next one is the research itself: the purpose of the research, what we are going to do in it, whether we are going to record, who owns the data, what we are going to do with the data, and, when we present the data, who we are going to share it with. And then about the participation itself: that the participants can stop at any time, and that the participants cannot talk about anything we discuss in the room — because a lot of the time when we do interviews, we test designs as well, including concepts that haven't launched yet, so we have to make sure the participants won't talk about them anywhere outside the room. But all in all, even though I've told you these are the things to include in your consent form: talk to your legal team. Talk to your legal person. Legal is your best friend. We had a two-hour session talking to legal after I'd been working in SP for about one, one and a half years, and the whole conversation was extremely enlightening: we learned how to create a consent form, what matters to legal, and, for example, that we can't just call users, because users in Singapore can opt out of calls — stuff like that is extremely important to look at. And depending on where your target interviewees are, you may have to deal with other compliance stuff, like PDPA or GDPR, which is very
fun. Anyway, moving on. In SP Digital we also have a standard five-part interview. It's a kind of framework that we follow — a framework, not a hard rule we must follow all the time. What this five-part interview does for us is provide a guideline, especially for not-so-seasoned researchers, to be able to conduct interviews by themselves, and it helps us communicate to the stakeholders what actually goes on in an interview. Our five-part interview consists of: first, welcome; then discovery; usability testing; concept testing; and review. Similar to everything we've just shared, not everything needs to be used — add things, remove things, etc. I'm just going to quickly go through what each of them is about. Number one: welcome. I don't know about you, but if some random stranger tells me to go down to their office and sit there for one hour while he or she asks me a lot of questions, I'm going to feel super nervous and not be myself. So in the welcome, we welcome people into the office and try to make them feel comfortable. Usually we provide food — we found that food is helpful in making people relax a bit. We put them in a comfortable but extremely cold room — we can't help with that — and we try to establish what the session is about: this is what's going to happen, we're trying to make you feel at ease, nothing is going to go wrong, it's very safe. I think it's more like a therapy room: we put in couch chairs, we have a carpet, and we have some plants as well. And for the food, we're not putting out just some candies — we put out really, really good muffins, known as the best muffins in Tanjong Pagar, in Singapore. So we try to make them as comfortable as possible talking to us — everything to make you feel like you're at home. After the welcome, we then jump into the meat of the interview. We go into
the discovery questions, where we ask them about their habits and their problems — not directly, of course, but through a lot of indirect questions that we will touch on later. We go into things like: what are your existing workflows, what are your existing challenges, how do you go through this — show me, etc. Following the discovery questions, usually we have either a usability test or a concept test, or both. For usability testing, we try to test whether or not certain prototypes have any usability issues. If we are exploring more blue-sky concepts that we're not even sure would work, we do a concept test, just to elicit a response from the interviewee: to see how they feel about it and whether they even understand the concept or the notion. After that we do a debrief. We thank the interviewee for their time, give them any token of appreciation or incentive if it was promised, and wrap it up. We ask them: do you have any comments for us, any other questions? Usually the bonus questions or bonus comments are very insightful as well. And also, because recruitment is so challenging, we ask them: would you be willing to take part in future sessions, and do you have any friends, family members or colleagues you would recommend for future sessions? About that particular question — asking at the end whether you are interested in joining our next interview session and whether you have friends or colleagues for our interview sessions — it is also our retro process. In my first few years of doing research, when I asked that question, you would probably think people would say, "ah yes, sure, obviously — it's $50, so why not?" But some people told me no, because apparently the whole research experience was not a good experience and they didn't want to go through it again. So it's also a good way for us as researchers to retro ourselves, to see whether the experience is comfortable for the users and
whether they would say yes to the same experience again. So we've talked a lot about the introduction to interviews: the plan, the recruitment, et cetera. Now we're going to go a little more in depth into how to craft the questions. When we talk about good questions, we always try to ask about past or existing experience. We don't really ask about future experience, because that would be hypothetical. The type of questions we typically ask are things like, "Tell me about the last time you bought this particular product," or "Tell me about the last time you used our SP app. What happened? What did you do next?" So we try to go into the previous experience, although we try to keep it within the last three months, because beyond three months the memory usually fades. We always go back to previous experience because it is the easiest way to validate whether the user would go through the same experience again in the future. If the user has bought something at a particular price, then that is probably the gauge of the price point we would want to aim for. We always try to prioritise open-ended questions. We sometimes use close-ended questions for the purpose of directing the conversation, but we always go for open-ended questions like "What do you think?" And sometimes, on a particular answer they give, I will ask, "Why do you answer it that way? Why do you say that?" We ask naive questions as well, and asking naive questions is not just about the questions themselves: your expression should also show that you are interested, and you shouldn't have any judgments towards your user. A lot of the time I will ask the user about their thought process: "Why do you think that way? Why do you think this is not good?" So: don't assume,
and you can always ask naive questions to the users. We use the 5 Whys as well. We don't literally put "why, why, why" five times in our plan; it's more of a mindset. When you see something, when you hear something from the user and you think it's interesting, you ask why. If there is a problem, you go into the root cause, and then you realise: this is the problem, this is the pain point. So we often use the 5 Whys, although it's more of a mindset of trying to be as interested as possible in your users. We also ask users to think out loud. A lot of the time when we start our research, my typical welcome to the user is: "Okay, in this session we're going to talk about this, this and this, and we will ask some questions about it. We're going to show you some designs, and when we do, we need you to think out loud. When you're confused, when you think the design is bad, when you have any comments about the designs, please say so. We're researchers, so your comments won't offend any of us." So we always try to get the user to think out loud. Okay, I was talking about good questions; now I'm going to go on to bad questions. A very good example of a bad question is the leading question, for example: "You like using feature X, don't you?" or "Why did you have difficulty with task X?" We consider these leading questions because we should first check whether the user is actually using feature X at all. "You like using feature X, don't you?" assumes the user is using the feature. So we always start by finding out whether they use the feature or not, and if they do: when was the last time they used it, how was it, how was the
experience? What did you do? Why were you using the app? And then: "How would you rate the experience with feature X? How can it be higher?" We don't assume they have problems with the feature either. We ask how the experience was, how they would rate it, and how it could be higher, and from there we can see improvement opportunities for that particular feature. Next, hypothetical questions. This is an actual question one of our stakeholders asked: "Can you ask the user whether they would change their appliances after 15 years?" I would say this is a hypothetical question, because sometimes they haven't even replaced their washing machine or their home appliances. Or, "Would you go to the store if you wanted to buy a TV?" They haven't bought a TV, so why are we asking whether they would go to a store, or where they would buy the TV, if they haven't gone through the experience of buying one? The typical question we would ask instead is, for example: "When was the last time you replaced a home appliance?" And they might say, "Oh, just last month." "What was the device?" "I just replaced my TV." "What happened? Why did you replace it?" Usually they would say, "Oh, it spoilt." I talked to seven users, and all of them said it spoilt. The whole hypothesis was about appliances reaching 15 years old and users wanting to replace the device because it's old; apparently that was invalidated, because most of the people we talked to replace their home appliances because they're spoilt or broken. "What happened? Why did you replace it? What was the new device?" We want to see whether they replaced it with the same thing or not, because a lot of the time it's like, "Oh, my Google Home spoilt," and in the end they bought an Alexa, for example. "Why did you buy this new device? How did you buy it?" Another assumption, another
example of a bad question, and one which I find very natural for a researcher: embedded assumptions. I used to have embedded assumptions; I used to have personal bias as well. For example, if there are a few designs or concepts produced, it's very natural for us researchers to have a preference, but I think it's our job to be as neutral as possible. A question might be phrased like, "What are the problems you face while using the app?" You have this embedded assumption that the user has problems using the app, because you feel the app is bad, and you immediately assume that all users would think this app is bad, which I think is a very dangerous assumption. So again, we always go back to previous experience: "When was the last time you used our app? Why did you use our app? How did you use our app? How would you rate the experience with the app? How can it be higher?" So we've talked about good and bad questions. We also have some pretty standard questions when it comes to testing design concepts, and these are standard questions that we actually ask. One way of testing design prototypes is to give people a task and see whether or not they can complete it. Another way is to have standard questions like these, and the reason we have them is so that anybody can quickly go out and do quick guerrilla testing or quick design-concept testing with people. So these are the questions that we've standardised for some of our prototypes: "Is this what you expect to see? What do you think this is? How can you tell? What goes through your mind when you look at this? What information are you looking for? What would you do next?"
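(As an aside for anyone who wants to operationalise this: since the point of a standard question set is that any colleague can run the same quick test, one option is to keep the questions in a tiny script. This is only a sketch; the question wording comes from the workshop, but the helper name and structure are made up for illustration, not part of any SP Digital tooling.)

```python
# Hypothetical helper: keep the team's standard design-concept questions in
# one place so anyone can run a consistent quick concept test.

STANDARD_CONCEPT_QUESTIONS = [
    "Is this what you expect to see?",
    "What do you think this is? How can you tell?",
    "What goes through your mind when you look at this?",
    "What information are you looking for?",
    "What would you do next?",
]

def session_script(prototype_name: str) -> str:
    """Render a simple, numbered interview script for one prototype."""
    lines = [
        f"Concept test: {prototype_name}",
        "Reminder: ask, then stay silent and let the user think out loud.",
    ]
    for i, question in enumerate(STANDARD_CONCEPT_QUESTIONS, start=1):
        lines.append(f"{i}. {question}")
    return "\n".join(lines)

print(session_script("Recommended-products section"))
```

Anything beyond the standard list (follow-up "whys", probes on specific answers) would still be improvised live, as the speakers describe.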
Don't mind the grammar errors. Now that I've pointed it out, everybody's going to look for the grammar errors anyway. We found that having standard design-concept questions was also helpful for anybody to quickly test their prototypes with colleagues, random strangers, and so on. So yeah, these are some of our design-concept questions. Now moving on to our favourite: quiz time. We've covered a lot of stuff, so now we're going to do a quick little activity and ask you all to rate questions. We're going to give you a scenario and ask you to rate whether these questions are good or bad. If you think a question is good, raise your hand; if you don't think it's good, you don't have to do anything, just keep your hands down. Now, as a researcher, I understand that that instruction itself is biased, because as an audience member I would just do nothing, since if I'm wrong nobody will see that I'm wrong. But anyway, this is not a research study, so we're just going to do a quick quiz. Okay, the scenario: you are the researcher or designer for a well-known online retail shop (never mind which one), and you're trying to figure out whether or not a design concept actually works for people. You want to test this design concept with users: does it work, are there any users who have issues, for example? Your colleagues or team members have come up with certain questions to ask users, and you as a researcher are going to help them review the questions. So again, raise your hands if you think it's a good question; do nothing if you think it's not a good question. Okay, number one: "What do you think this is?" Good question or bad question? And the answer is: yeah, it's pretty okay. It's neutral, it's open-ended, and it will
help to reveal the user's first impression of the design concept. Wait, this one I think is a very, very good question, because when we produce concepts, there have been quite a few interviews where I ask this particular question, "What do you think this is?", and apparently the user sees the feature nothing like how we see it. Here is a page, and here is a particular feature we're trying to highlight and test, and apparently when we show it to the user, the feature is not visible to them at all. So this is something we use to see the first impression, and what kind of perspective the user has on the feature or the page. Next one: "Would you buy the products that are recommended?" Yeah, this is a pretty leading question. It assumes the user will use the recommended section to buy the product; it assumes they will buy the product at all. There are a lot of assumptions baked into this one. It might be simpler to just ask, "What would you do next?", and if they actually say, "Oh, I would buy the products," then okay, you might want to probe a little more: "How would you buy the products? Can you show me?" Next one: "If you want to add to cart, which one would you click?" Good or bad? Yes, it depends on context. If it's out of the blue, it's a very bad question, because all of a sudden you're asking people, "Hey, you want to add to cart, which one would you click on?", and you're assuming that the next thing the user would do is add to cart. So you might again want to ask, "What would you do next?" Why would it be good, depending on context? If, as I mentioned just now, the user has specifically said, "Oh, the next thing I would do is add to cart," then as a researcher you might want to ask, "If you want
to add to cart, which one would you click?" You're asking them to show you in a lot more detail. I think it's also part of active listening. For example, when you ask, "What would you do next?" and they say, "Oh, I'm going to research the product," you don't just jump to the next question. As a researcher you should be interested: "Okay, if the next thing you want to do is research more about the product, how are you going to do it?" So it's active listening: you refer back to their previous answer and use it to understand more about what they said. This one was a slight trick question, because I also want to explain that sometimes a question by itself may not look good or bad, but taken in context it has to make sense. As long as overall it's not leading, and you're drilling in, this question is actually okay. But of course it can be further improved; "which one would you click" by itself may already be directing them to a specific section. Now the next one: "What other information do you want to see on this page?" Good or bad? Yep, it's a good question, in the sense that it will surface opportunities for content we can add to this page; we'll find out what missing information there is that people would expect to see. Next: "How many times do you usually go to this website in a month?" As Nani mentioned, how do you define a month, and which month are you referring to? It's actually easier for the user to recall his or her past experience than to estimate current behaviour. So one way to make this better is to ask specifically: "In the last month, how many times did you go to this website?" But honestly, this can be further improved: you can actually track it, if possible, and the tracked data won't lie. I barely remember what I ate for lunch yesterday, and you're asking me to remember last month? If you can just track it,
please, please go ahead. I think the easiest way to see whether this question is right or wrong is to apply the same question to going to the gym. "How many times do you usually go to the gym?" If you ask me that, I would say probably three times a week; but if you ask me about last week, probably one time. It's a very good example of why we always go back to previous experience: users can say whatever they want to say, but it's usually very different from what they're actually doing. Just now, while Nicole and Alisa were conducting the Instagram workshop over here, Nani and I were also having a conversation about our Instagram usage, figuring out how much time we actually spend on Instagram. I'm not going to reveal anything about our own usage patterns, but when we went into Instagram to actually track it, we were like, "Oh, our estimate is completely off." Just saying. So tracking it might be the best thing to do. Any questions about writing questions? Any questions so far? "Yes, I've got one. When you do a screening survey to recruit, have you ever had a user come in and lie to you about using your product? And what have you done to maybe filter that out?" Okay, so the question was: when we screen users and call them in, have users ever lied about using our product? To be really honest, I've never had that particular experience. I've had users who come in and just want the $50, and usually we give the $50 in cash, which makes it even better, right? We get that kind of user; it happens. But a lot of the time when we get such a user, we go through the session first and then we blacklist them from our pool; we remove them immediately. But to be really honest, I don't know why a user would lie, and
usually, in the welcome section, it's not only about setting expectations; it's also about building a relationship between you as the researcher and the user. A lot of the time I will chat with them. For example, if it's an auntie or uncle, I'll talk to them: "So how is it, uncle? How was your holiday?" and so on. In that session of less than two minutes, I try to tell them: we just want to talk to you, and we need you to be as honest as possible. I think it's extremely important. Previously I would overlook this; I wouldn't really tell them anything beyond "oh, this is a research session," and that's it. I wouldn't build any particular relationship with them. But there are times when an auntie came in very hesitant to talk to us, and we built that relationship, and after one hour she didn't want to leave the room because she was still talking: "Oh, my holiday yesterday; oh, then I went to China," talking about everything from her previous experiences. That's actually a very good sign for you as a researcher, because you've made that connection. So I find those few minutes of chit-chat, whatever you call it, extremely important for building the relationship, so you can assure them they can be as honest as possible and that whatever they say won't offend anyone. Thank you. "So my question is: I'm working on a product which is pretty much a blank canvas that you can use to do anything. It's very abstract, in a sense, and if you ask these types of open-ended questions, it's really hard to get specific feedback, because there are gazillions
of different use cases, so if we talk to 50 different users, everyone would basically be talking about different things. Do you have any recommendations on how to target the feedback to a specific feature of your product, or give them a specific use case, without limiting their thinking?" Okay, so before we start the research, as Sheehan mentioned, I think it's extremely important for you to work out what you want to know. Come up with some behavioural experiences that would be relevant to this particular product, go deep into that particular experience, ask the users how it was, and go step by step into understanding the whole thought process of how the user starts. Another thing: when we do user interviews, we don't purely ask discovery questions. We always come up with some designs. Even if there's been no design workshop, even if there are no designs yet, we always try to come up with something, because when we show the designs, we can see the reaction to them. Showing designs is the easier way, rather than asking hypothetical questions. When we show the designs, sometimes we show a bit of the price as well: "Would you buy this product for $200?", for example. So we usually try to come up with some designs first and get feedback from there, even if the designs are not yet aligned between the stakeholders. It's just a way to get feedback about the particular feature you're thinking of building. "I have a kind of follow-up question. We've been doing user testing with users with disabilities, and it's really hard to find users with real experience using your product,
especially when you go into a very specific type of disability, and we've been having to run these tests with hypothetical questions. Do you have any thoughts on how to run that type of test, or how to increase the size of the pool?" Do you want to take it? Hard-to-reach users. So, I haven't done any research in the past that specifically dealt with users with disabilities, but hard-to-recruit users, that's an area I've always been working in. Generally, the only answer and consolation I can give you is: we just have to keep at it, be creative, and be patient. For example, with the users I'm trying to recruit right now, Nani was helping me in the past, and it took her three months to find three people. That's how long it took to find three people we could actually speak to. For myself, I was lucky; it took me about one month to find two people. So a lot of the time with hard-to-find target interviewees, there's no way around it: you just have to be creative, keep figuring out where these people would be, how to increase our exposure there, keep going to all the different events or locations they might be at, or just keep banging our heads against the wall and see what happens. Right now I sign up for everything; literally my feed is all about building management in Singapore and what events are coming up. I'm just like, "Okay, this is good, I'm going for this. Again, this is good, I'm going for this." And look, I stick out like a sore thumb; I'm the youngest person there most of the time, this young female walking around handing out name cards and trying to speak to people. A lot of the time when they talk to you, they ask, "What are you doing?", and it's extremely hard to explain: "We're UX researchers, we want to talk to people." And they're like, "What is UX?" And I'm like, "Uh, UX is, um..." So yeah,
sometimes we're dealing with the language barrier as well. Many of them insist on speaking to me in Chinese, and I speak Mandarin, but I can't speak Mandarin for work. So these are some of the things we have to learn to manage. "Sorry, I guess this is getting a bit annoying, I'm asking so many questions. One final question: if you can't reach those users, do you think asking those hypothetical questions is worse than not doing any research with that type of user at all? Is the data so biased that it's basically useless, so you would rather just not do it?" I would suggest not using hypothetical data. I would suggest looking into other sources of data rather than interviews. For example, if you have some measurements, maybe look at those; or existing papers, like her, over the past few months, I don't know how many reports you've looked at. Even when we don't have users, it doesn't mean we stop our research. A lot of the time we look at existing research that has been done in the space. We looked at, you name it: Accenture, McKinsey, whatever research there is about this particular space, just to give us some understanding. Thank you. Let's move on. Some of you have questions; we will get to them, we will have Q&A sessions, don't worry. Okay, we have a hands-on exercise, which you can refer to in the handout, I think it's page 32, and I will show it on this slide as well. So basically this exercise (page 32, in the last section if I'm not mistaken, the activities and exercises section) is a real project that I used to work on, and at the time I didn't have much more context than you have right now. In this scenario, we are building an EV charging station network in Singapore. What we want to know is the current behaviours of cab drivers around EV charging (EV is electric vehicle),
and their problems and challenges in charging an electric vehicle as a cab driver. The target audience is cab and private-hire car drivers, reached through partnerships. One of the constraints is that you cannot disclose the intent to build a charging station network, as it has not been published in the press. So for the exercise, the materials are on your table, and this is a group exercise, so do it as a group. Use the big post-its to identify the objectives, and use the small post-its to map out the questions based on your objectives. You have 15 minutes. Before I start the timer, do you have any questions about the context? If you are confused, don't worry; I had about this much context when I worked on this particular research as well. Okay, I will start the timer, we have 15 minutes. Can you keep time? Okay, you can start now. [Background chatter.] Sorry, we praised your playlist, because this is an amazing playlist; I will be using it as well. It absolutely helps you to focus. Okay, question? Wait, let me check. So, the task for you guys: I gave you the context, which you can also find in the handout, but basically, use the bigger post-its to come up with the objectives of the research, then use the smaller post-its to come up with interview questions based on those objectives. Is that clear? Okay. One clarification: these cab drivers are already driving EVs at this point. So you can ask questions like, "How many times did you charge your car yesterday?", or "How many kilometres per day do you drive, yesterday for example?" The objective could be anything; you might want to learn about the pain points, or about the behaviour, specifically around charging the car, for example. It could be anything, and it's up to you
guys. Let me know if you have questions. [Table discussion:] "Is this a group exercise?" Yes, a group exercise. "So I think the plan is: do we know whether they used to drive a regular car? Then we can ask about their experience driving that, and then about how they charge now, the convenience, and so on." Yeah, yeah. Any questions? "Is there something we want to achieve by asking this question?" It's up to you. For example, you might want to learn about the current behaviour around charging an EV, and whether they have any problems or challenges in charging it. You can branch some of the objectives out from there, and once you have your objectives, come up with questions based on them. Is that clear? Yeah. "Learn about current behaviour in charging an electric vehicle?" That's okay, yeah. "When was the last..." Okay, so I would start with the welcome question. But the objectives come first. [Facilitators conferring about logistics:] For the team that runs through their questions, how many do we give, three? Ten. Maybe ten at each desk. Is it? Not everyone gets ten; we should give the number they have. So one team only? Because we have limited... Nine more minutes. Okay. How's our timing? Should be enough. Six, right? I think we've got one hour. After this we should be back on time, and then there's the demo, right? People should find the demo quite fun. After this is what, conducting the interview? I think some of them are a bit lost; we can go round the tables. Yeah.
[More table discussion, partly inaudible.] "I have a question. Are we to assume that the people we are interviewing are already driving EVs?" Yes, they are driving EV cars. Sorry, I should have put that in. Hi, sorry, I forgot one more piece of context: the people that you are going to interview are EV drivers, EV cab drivers. Sorry, sorry. Okay, you can try to take them back to the previous experience. So maybe: "When was the last time you charged your car?", and then go on with all the follow-up questions from there. "Can we have the 15 minutes for just the objectives, or do we have to..." The 15 minutes includes the questions. "Ah, okay, we're making the questions too." Yes. You don't have to make as many objectives as possible; you can come up with maybe two objectives and focus more on those. [Table discussion:] "So I've actually had a screener with questions asking whether they're married or single, where they live, and so on. Sometimes you can catch them out against your database: 'Oh, you say you're single now...' So we had the answers." Yeah, those are worth asking. And also demographics: "Tell us about your background." It can be very specific, you know: are you married, how old are you? But if it makes it easier for you, you can try to come up with maybe two or three objectives. Yeah.
And just focus on those objectives. Okay. You don't have to complete the whole set of interview questions. Yeah, yeah. Okay. "Can we just practise on behalf of anybody?" Okay. [Exercise in progress; background chatter and music.] Four more minutes, four more minutes, guys. Not bad, not bad, and they are standing up. When I worked on this EV project, I was, like you, reading all the reports on EV, to the point where I was about to go look at the actual cars. Three more minutes. You can sit. Okay, thank you so much. [Facilitators conferring:] "How do you get exactly the right segment? How do you recruit them?" I can answer both, I think. When one of the teams runs through their questions, I'll ask the other teams to review. "Do you want to put that a bit later, in the last section?" Yeah, I think I'd prefer to put it in the last section. Hopefully less than 15 minutes, one round. One minute. One minute. Ten seconds. Time's up, time's up. Time's up, guys. Time's up. Hello, hello. The team in the corner over there, time's up. Okay, so you guys have come up with the questions. What do you guys think about that? About the...
"We're not ready." Sorry, time's up. Okay, I need one team to volunteer to run through the questions and objectives that they have. Any team want to volunteer? No one? One team, do you guys want to volunteer? Okay, sure. So when this team is running through their questions and objectives, the other teams can ask them questions as well, maybe if you're interested in why they're asking a particular question. Because in practice, when you present your questions and your plan to your stakeholders, in a lot of cases your stakeholders will ask questions about your questions (oh, it's a bit meta, yeah): about your objectives and your questions, and there will be a lot of alignment as well. So when they run through their objectives and questions, I want you to ask questions too if you want to learn more. Okay? Can you go on? "Okay, so one caveat: we don't have a lot of questions, because a lot of the follow-up questions depend on the answers." Nice. "For the behaviour around charging, we'll ask them something like, 'Tell me about the last time you charged your EV,' and some contextual questions as well, like 'How long have you been driving this EV? Why did you choose to drive an EV?' Then more questions about charging: 'How long does it take to charge? How long does one charge last?' We found out from one of the people at this table that charging an EV takes about an hour and a half; that's why follow-up questions are really important. What do you do while you're charging for an hour and a half? We might find a problem there, we might not; we don't know. Then: 'Do you pay for it? How do you pay for it? Is there any equipment you need to bring to charge? And how do you find a charging station?'
And regarding the payment for charging the EV, if they do pay for it, then we'll ask something like: how much did you make last week as a private-hire or cab driver? Then we can check that it's more than what they're paying to charge. Okay. Any questions from the other teams? Wow, these stakeholders are very nice. Okay, so I think the questions that you guys mentioned are very, very good. A lot of them are included in my actual research plan, which I'm going to show you in a minute. But you guys get a present. Sorry, it's only chocolates and some biscuits. And a note for the rest of you: if you participate in our activities, we'll have more chocolates as well. Okay, so next I'm going to run you through the questions that I asked. You can get the answers in this link. Sorry, I didn't put the questions on the handout, because then you could see the answers, right? But you can get the questions through this link, and I will show them in the slides as well. Okay, so we start with the objectives. And don't worry, when I started this project I didn't have any context at all about EVs. I don't even drive; I live in Asia and I don't know how to drive, so I had no context at all in driving, let alone in driving an EV. So we start with the objectives. The objectives at the time were: learning about EV cab drivers, and learning about the lifestyle change from driving a petrol car to an EV. And then we had some designs that we wanted to test with the users as well. I've put some snippets of the questions here; the full list is extremely long. We asked things like: when was the last time you charged your car? How did you know you needed to charge your car?
Because we want to see whether they have a regular timing for charging the EV, or whether they go when they feel they need to charge based on the battery. And then: what did you do next? So once they know they have to go and charge, what do they do? Do they have a specific charging station they always go to, or do they have a way to find charging stations? And then we asked: where is the station? How do you know of this station? Again, we want to know how they found the station. Is there an app? Is there a list? Because a charging station is not like a petrol pump, which is available everywhere. And then: why do you go to this station? Is it a specific charging station they always go to, or is it because there are amenities around? One of the hypotheses the PO had at the time was that a charging station has to have very good amenities around it, so we wanted to check if that was the case. We also asked: do you always go to this station? Do you know how many chargers the station has? Because we want to know, if they go to a specific place, whether they know if the chargers are available at that time, and how they know. What do you do if the station is full? This particular question was actually there for us to validate a concept, because there was a hypothesis at the time to build a feature to book a charging station, on the thinking that drivers might have to queue for charging stations. And the answer from all of the drivers was that, apparently, that's not the case: they never really have to queue, because there hasn't been a case where the charging stations were full. And then: what chargers do you use?
This one is because we wanted to see, when we build filters into the app, what kind of filter we should use. Is it AC/DC? Is it the speed of the charger? So that's why we asked what charger they use, and we put AC/DC in brackets as a note for ourselves as researchers. And apparently they don't even know AC versus DC; they only know the plug type. They recognise this plug type or that plug type, so whether it's AC or DC doesn't really matter to them. And then: how do you know if it's AC or DC? Again, apparently they don't know. How long did it take to charge? How often do you charge the car? Do they have a specific timing? And then: what do you do during that time? Again, because we had this specific hypothesis that amenities around the station are extremely important. But apparently, at that time, it wasn't really important, because, a bit sad, but when they charge the car they sometimes sleep in it. There were a lot of cases with slow charging stations where it would take them about two and a half hours to charge. So what they do is, before they go out to drive at 3 a.m., they bring a pillow and a blanket, put them in the car, and while the car is charging, they sleep inside it. So there are cases like that as well. And then: when do you collect the car? Our assumption was that if they go charge the car and it takes two hours and they walk away, how do they know the charge is complete? Because at this point there's no technology connecting the car and the app to show the charging progress. And then: how do you decide when to collect the car? Oh yes, a question, sorry. Sorry? Yes, yes, very good catch.
So yeah, initially we were trying to anchor on "yesterday". The hypothesis we had at that time was that they charge once every two days, and if we asked about "yesterday" I was scared we wouldn't capture the previous day's behaviour. Although when we asked this question, we realised that at the time, because of the slow chargers, they actually had to charge every single day. So yes, this was the wrong question, and we should probably have asked: this week, how many times did you charge the car? And then if they say five times, five times across which days? Is it twice a week? Is it every day? What is the pattern at that point? And then: do you remember how much it cost? And how do you pay? Do they pay cash? Do they pay by credit card? Because at that point we didn't have the answers or insights on any of this. And then: how long and how far can you drive after you charge the car? Because we wanted to see whether they could gauge the range when the battery is full. And then: where else would you want to see charging stations? This fed into our planning for where we should put more charging stations in Singapore. One thing worth noting as well is that there's not a lot of tracking we can rely on, so sometimes we do have to fall back on estimates, unfortunately. This goes back to what we were saying earlier: some questions we know are bad, but sometimes you still have to ask them anyway, and it depends a lot on context. So as much as possible, try to get accurate insights, but when it's really not possible, we do have to pull back on certain things. That's the nature of qualitative research. Anyway, another example, this time from the B2B side of things at SP Digital. As I mentioned, I do utility management.
I research the behaviours, workflows, and challenges around utility management in commercial buildings. So the stuff I have to find out would be things like: what's the importance, effort, and process around utility management in a commercial building such as this one? Do people even care about the data, or is it more like, "I don't care what this building's utilities cost"? Do they just want it as pain-free as possible, or is it currently pain-free? Are there any challenges at the moment? So I'm just going to show you some example questions to give you a sense. In B2B we sometimes know very, very little about the workflows, so we have to fly quite blind, and the questions we come up with are often very open-ended; they serve more as a conversation guide rather than a definite "you must ask this" kind of thing. So here are some example questions that we have. Usually when I'm trying to investigate utility management, I'll start off with a very open question: how do you currently monitor your building's energy and water consumption? If they start rambling on, I will skip a lot of my later questions; but if they don't and they need more prompts, I'll use the guide to remind myself which areas I need to probe into. When it comes to commercial building utilities, there are actually a lot of different areas you can go into: tenanted areas, common areas; for example, if you walk around, who do the toilets belong to? The building owner, or the tenants? There are also things like what software is being used, and whether they monitor the absolute consumption or the monetary amount. So the first question we ask is deliberately open, just to set the stage and see what's top of mind for them.
So we ask: how do you currently monitor your building's energy and water consumption? And then, if they don't talk about certain things, we start probing more specific areas: which areas do you monitor? How do you break down the consumption? In their mind, for such a big building, how do they break things down? Could you run me through the process? We ask them to show us their process. We also ask: if they are monitoring, why do they monitor? If they're not monitoring, why not? What do they do after all the monitoring is done? And if they don't bring up the money side of things, we ask: of your expenses, roughly how much does energy and water consumption take up? This is an interesting question, because it sometimes puts building managers on the fence; whenever we talk about money, people go, "why do you need to know about this amount?" That's also why we deliberately place it towards the middle, once we have a sense of how comfortable the building manager is. And then, since we've started talking about money, we'll ask a little more: how do you track the bills right now? Is there any planning that happens? And if you're not currently doing any monitoring of the above, are there any plans to?
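The flow just described, one deliberately open opener plus a set of conditional probe areas the interviewer falls back on only when the interviewee doesn't cover them, can be sketched as a tiny data structure. This is purely an illustrative sketch: the area names and question wording below are assumptions, not the team's actual discussion guide.

```python
# Hypothetical sketch of a branching conversation guide: an open
# opener, plus probe areas used only if the interviewee doesn't
# cover them on their own. All text is illustrative.

OPENER = "How do you currently monitor your building's energy and water consumption?"

# Fallback probes, roughly in the order described (money deliberately mid-guide).
PROBES = {
    "breakdown": "How do you break down the consumption (tenanted vs common areas)?",
    "process": "Could you run me through the process? Can you show me?",
    "motivation": "Why do you monitor (or why not)?",
    "cost": "Of your expenses, roughly how much does energy/water take up?",
    "plans": "If you're not monitoring today, are there any plans to?",
}

def remaining_probes(covered: set[str]) -> list[str]:
    """Return probe questions for areas the interviewee hasn't touched yet."""
    return [q for area, q in PROBES.items() if area not in covered]

# If the opener already surfaced the breakdown and the process,
# only the uncovered areas remain as prompts for the interviewer.
for question in remaining_probes({"breakdown", "process"}):
    print(question)
```

The point of keeping it a flat mapping rather than a fixed script is exactly what the speakers describe: the guide is a reminder of areas to probe, not a sequence you must follow.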
As I mentioned, we treat the questions for commercial building utility management more as a conversational guide, because we really were very clueless (not so much anymore, but we really were) about all the processes and workflows people have. That's why some of our questions are not very well organised; it's more like, "oh, that's a problem, I should ask about this now." A lot of the time as well, before my colleague joined, I was the one doing these interviews, and at the time it was extremely hard to catch the domain language, because it's all about electricity, energy, and physics. I asked too many naive questions, and I really remember the first interview I had with a building manager: when I asked a question, he gave me this look, really questioning why I was asking such a stupid question. But I think it's also important to test your questions in a way: after the interview, you get a sense from the user's reaction of whether your questions are right or wrong, and then you can iterate on your questions. So in qualitative research it's really a bit of a balancing act: you want to be naive, but at the same time you don't want them to actually think you're an idiot, because then they'll be totally condescending, like, "don't waste my time, this is a useless interview." Following up, we also asked questions like: do you monitor your tenants' consumption as well? So now we're moving into more specific areas. If you're not currently monitoring your tenants' consumption, are there any plans to? Why, or why not? And if you are monitoring, what do you do with all of this monitoring? So again, this is a conversation guide, and very often during the actual interview
we will add on a lot more questions, we will skip questions, and so on. So now we move on to tips on conducting a user interview. Again, as part of research ops, we do a lot of things to try to make the process faster, from having email templates, to, what was it again, having a user pool. We even have a huge checklist, which you can get in the handout; on page 19 there's a link. It's a checklist template you can download, adapt, and use yourself: a checklist for before the interview, for two or three days before the interview, and for during the interview. Number one, the first rule of thumb, again something we've mentioned: trust is an overarching objective. You have to get the interviewee to trust you, because otherwise they're not going to share everything with you; they're just going to give you one-line answers. You ask a question, they give you one line, and the interview is going to be a waste of time. To give you an example as well: when I talked to one of the EV drivers, he came into the room and he was like, "what the hell is this, why am I supposed to be here, why did you call me?" And then he sits down, and I present to him: "I'm a researcher, in this interview we're going to ask some questions, and in the meantime here are some snacks, some muffins, they're very good and still warm." And you know what he did? He took all the muffins, wah, "this is very good, I haven't had lunch," he took all the muffins, and then he said, "okay, ask me anything." So you realise that just a small thing, like providing good food and making them comfortable enough to trust you, is extremely important to
smooth out the interview as well. So try not to be judgmental. Don't be judgmental: whatever they say, nothing that comes out of their mouth is too wild. For that hour, in that room, they are the most interesting person in the world, and you want to do nothing else but listen to them. Just make them trust you; make them feel safe, so nothing is going to go wrong in that room. Second rule of thumb: try to get participants to show you. I think we covered this when we were talking about the questions as well. Try to get people to show you their process: ask them to walk you through it, ask them to click through the apps. If they mention certain things they like or dislike, ask them to show it to you, so that you know exactly what they're talking about and there's no misunderstanding whatsoever. The next thing is something I think we also touched on: rapport. Like I said, some of the interviewees I interview speak Mandarin. I don't speak Mandarin well for work; okay, I speak Mandarin well at home, but not for work. But to build rapport, sometimes I have to go into full-on Chinese interviewer mode, like "ni hao ma?" Sorry for the non-Chinese-speaking people. But sometimes you just have to switch; do whatever it takes to build rapport. Sometimes I go into full English mode. I am a very quiet and introverted person by nature, and there was once this interviewee who came in, opened the door, and went "hey, how are you!", and inside I was like, jialat. For the whole session I had to be like, "yeah, yeah, that's really awesome, can you show me how you do it?" After that hour I just went back and couldn't talk to anyone. But yeah, you have to match their energy level; just try to make them comfortable. I'm not very good with English even though I'm Singaporean, but I also have to go into "wah, that was really good, can you show me a little bit?", that kind of
Singlish mode, you know, and all those different kinds of things. Yeah, build rapport. Next: mind the gap. This relates to point number two, on getting them to show you, because sometimes what people say and what people do are not the same. For example, my colleagues suggest lunch locations for me all the time. I'll tell them I'm usually okay with any lunch, but then when they ask, "sure you want to eat this one?" I'll be like, "no, not that one." When they ask me something hypothetical, I'll say, "yeah, this sounds like a fun activity, I'll do it," but when it actually comes to doing it, I'll be like, "no, not that one." So what people tell you can differ from what they do; try to ask people to show you as much as possible. And allow for awkward pauses, like this one. I'm not sure if it's a cultural thing or not; personally, I'm very okay with silences, but I do have friends who cannot bear silence, and when I see them interviewing, they're cutting in all the time, leaving no space for the interviewee to actually get a single word out. In interviews, pauses are fine. Let the interviewee collect their thoughts; let them have silences to think. And sometimes awkward pauses are good, in the sense that your interviewee may be someone who cannot take awkward pauses, and they will be the one rushing to fill the silence by giving you even more material you didn't even ask about. So allow for awkward pauses. Okay, so we've talked a little bit about rules of thumb, or guidelines, for running an interview. Then, how do we take notes? This is our method for taking notes. When I first joined, I was super impressed with this, because previously this was not how I took notes. Your method and mileage may vary, and you may be using different tools, but at SP Digital, how we do this is basically we
use a giant, and "giant" is an understatement, a ginormous spreadsheet that looks like this. In the first row we have the users' names, and the questions go in the first column. So for example, let's say I'm doing a research project with some very familiar-sounding people like Harry, Hermione, and Ronald (I wonder where they came from), and our questions would be: tell me about yourself; how long have you been working; what are some of your challenges; what do you love most; what do you want more of. The questions go in the first column, and then we fill in the answers against each relevant interviewee, only the ones we're interested in. We'll talk a lot more about this later on, but for now: what do we fill into each of the cells? Basically, we try to make our note-taking comprehensive enough that we can eliminate the transcription process, because transcribing is extremely time-consuming; it doubles your work. If the recording is about an hour, that means you have to add at least another hour just listening to the people again, plus the pauses and everything, to make sure you really catch it all. So we try to make our note-taking extremely comprehensive, to the point where, currently, we can really eliminate the recording process, because our notes are comprehensive enough. But here's the rule of thumb for our process: as a researcher, you will most likely start thinking about things to improve the moment you get comments from the user. The goal of this note-taking process, though, is that you're trying to record everything that happened, and you put in your own thoughts and summaries only when you're cleaning up the data. We also use different colours to make the data more visible. For example, a lot of the time when users see a design and they're confused, their eyebrows frown, so
then we will write "interviewee is confused and couldn't find the button." That's how we put it, instead of putting a solution in the note itself. And, for example, "she wants to eat chicken rice" is not how we would usually take the note; we write exactly what they said. For example: "what do I want to eat... chicken rice, ah yes, chicken rice." We always write exactly what they said and how they talked through it, because we've realised that when we capture the expressions and even the pauses, we can gauge how strong the opinion is when we look at the notes again while cleaning up the data. "This page is... not bad": here we are not eliminating the awkward pauses; if there's a long pause, we put it in, "this page... not bad." You really put in everything, even the awkward pauses, so you can see that it took the user quite some time to come up with this particular opinion. And when you see that, you realise the user was probably trying to filter his words; that's why he said "not bad," when in his head it was maybe quite bad, for example. So this is an example of the raw data that we usually have, and we should thank Shiyoon for faking this data, because she had to go through the whole storytelling process. So typically this is the raw data, and after the session I will go through how to clean it up. Next will be conducting an interview, so this time it will be a role play of an interview. Who wants to volunteer to be interviewed by me? It will be about 10 to 15 questions about smart home devices. Anyone want to volunteer? Yes, thank you, give her a round of applause. Okay, thank you so much for coming. Oh, no problem. Okay, so in this session,
my name is Lanning, I'm a researcher from SP Digital. In this session I'm going to talk to you and ask questions about smart home devices: what devices you have, your buying process, and how you installed them. The whole session will take about an hour, and then I'm going to show you some designs (well, there are no designs in this demo, but normally I'd show some designs as well). When I show you the designs, it's the designs being tested, not you, so I need you to be as honest as possible. None of us here are the designers. Usually there would be an observer here too, but obviously in this demo it's only me. Whatever you say, if it's confusing, if it's very bad, if it's very ugly, you can just say so. Okay, so we'll start with the smart home. What is a smart home to you? It's the ability to control different features of the home, like lighting or air conditioning, digitally and conveniently. Okay. Do you own any smart home devices? I have a Google Home Assistant. Okay, and do you have any other devices besides the Google Home? No. Okay. In the past week, how many times did you use the Google Home? Every day. Every day, okay. And I assume the Google Home is the last smart home device you bought? Yeah. Okay, when did you buy it? Sorry, I just remembered, I have a smart light bulb. Okay, so the last device you bought was the light bulb? The light bulb. Okay, but the one you use the most is the Google Home? All right, okay, perfect. And the light bulb, when did you buy it? Like, over a month ago. Over a month ago, okay. And why did you buy the smart light bulb? Because it was compatible with the Google Home, and you can just tell it to turn on or off, or set it to turn on or off at different times, different colours. Cool, okay, I think it's cool as well. And what was your expectation before you bought the bulb? That it would be easy
to set up and that it would work fine. Okay, and when was the last time you used it? Probably almost every day. Okay. May I know the brand of the light bulb? I think it was Philips. Philips Hue. Okay, how did you decide on Philips Hue? I just went on Amazon, and I think it was the one that one of my friends had, and he uses it. Okay, so it's because it was on Amazon, plus your friend had it? It was easy to find, and I know people who have used it before and liked it, and the price point was good. Okay, and when you say the price point was good, how much did you pay for it? So, actually my boyfriend bought it; that's why the price point was good. Okay, okay. I'm actually from Canada, so it was around $50, $50 for a pack. All right. And you mentioned your boyfriend bought it; do you know how he bought it? Amazon. Amazon, okay. I know your boyfriend was the one buying, but how would you rate the buying process, or did your boyfriend give any comments about the buying process on Amazon? It's pretty seamless. You can use Amazon Prime, you get two-day delivery, all the payment information is there already, so you pretty much just click and go. Okay. So, moving on to the installation, the process of installing the light bulb. How did you set it up? It's very easy: you just take it out of the packaging, screw it in, and, from what I remember, it was just a matter of linking it to the Google Assistant device on your phone. Okay, how long did it take you to set up? Like, under 10 minutes. Ten minutes, oh, that's quite fast. Yeah, it wasn't very long. Okay, so it took 10 minutes to install, and what did you do afterwards? I just told the Google Home to turn it on. Yep, colours, okay, that's cool. And what do you think about the installation process? I thought it was
pretty simple, no complaints, from what I can remember. Okay, so I think that's it for all the questions I have, but would you be keen to be part of our research in the future? It would be something like this. Okay, that's good. And do you have any friends or family that you would recommend to join a session in the future? Okay, that's all, thank you so much. And this is for you. Oh, thank you. Okay. So, what do you guys think about that session? Great job. Oh, thank you. But yeah, what do you think? Any comments? You can be as harsh as possible; I'm a researcher, very used to harsh comments. Any comments, any questions about the whole session? Yes? So you mentioned that none of you are designers in that session. Yep, and a lot of the time I'm lying, because usually the observer is the designer. But yes, I always say that none of us are the designers, and usually that's not the case. As researchers, we always try to get the designer to be part of the research process, so for example, if there are four interviews in a day, the researcher might be me and the observer, the one taking the notes, would be the designer, and then we switch turns. So there are times when the designer takes the lead on the interview and says the same thing, "none of us are designers," even though he actually is the designer. We always say that because we want to assure the user that they can be as candid as possible when they give feedback. I think for commercial it may be slightly different, because professionals may want to stay connected with us afterwards, so we can't lie. Yes, for the B2B side of things we don't lie; we actually just say, "please be as honest as possible, don't worry about hurting our
feelings, we won't get offended; it's actually helpful to be as brutal and honest as possible." So yeah, it also depends a lot on context; whether you can do this really depends on the context. I don't think you need to lie; you can actually just get the designer to look at it later, because the note-taker could be a research assistant. Oh, okay. So yes, as researchers we always try to get the designer into the room so they can see the opinions firsthand, although, since we try to make the note-taking as comprehensive as possible, they should be able to gauge the reactions from the notes as well. We don't record anymore, because we believe that putting a camera there would put pressure on the user once they see it. So we don't really record anymore. We do cast the session, using a webcam, and the webcam is up on the wall, so they don't really notice there's a camera and that it's being cast. But the purpose is casting it to the team; we don't record it, and there's no recording behind the note-taking. The comment about recording is an interesting one. For my interviews, sometimes the building managers we want to interview will suddenly bring their colleagues along, so it turns from a one-person interview into a five-person group interview, and I'm like, note-taking for this is impossible. So we actually do request to record at times, but we do notice that whenever we tape, yes, it affects things a lot. So for our case, we don't record things by default; we do record at times, but we try not to, where it's possible to avoid it. For my case, when it's five people, I'm like, "I'm sorry, I have to record this, because there are too many people, I cannot take notes on all of them." So we record then, but it's perfectly fine to go without recording. Any other questions? Yep. Hi, I'd like to ask: well, I've experienced user research, and it's a similar situation, like when the
response in some cases is that somebody else had the journey or the experience, like she mentioned her boyfriend. So would you continue with the questions if that happens, and how accurate would the answers be? Okay, so yes, you could see that when she answered that particular question, I was supposed to ask, "so how was the experience, how would you rate it, how could it be better?" But it wasn't her experience, and I think it depends on that particular experience. What I should have asked her at that point, because I believe she would have bought something else from Amazon recently as well, is about something similar, maybe not the bulb, but the same kind of experience, like shopping for something on Amazon. So maybe I should have referred to something else in the same context. I wasn't sure at the time, so I didn't ask the question. But yes, try to get something that is as similar as possible. And if it's someone else entirely, for example if her boyfriend bought it maybe six months ago and she wasn't even there when he bought the light bulb, then I wouldn't ask the question, because she wouldn't know about the whole experience. Any other questions? Can you give me an example of, say, asking after the interview? You're looking through your notes, maybe you missed something, or you would like to clarify something. Okay, yes. That rarely happens for me; maybe it happens a lot for Shiyoon, but for me it doesn't really happen. I think it has happened maybe two or three times, where I would email them. We can't really call them, because in Singapore there's the Do Not Call Registry, and you can't call a particular person without the person's consent. So that's why we really, really try to avoid calling them, and we usually try to
email them, just referring to the particular interview: you answered this question, can we get more details about it? And they will answer over email. For the commercial side, because interviewees are harder to recruit, each of them is very precious, so we do follow up, usually via email. Interestingly, professional contact information is not as sensitive as personally identifiable information, so I can actually call them — it's different — and a lot of the managers actually prefer that we call them to follow up; they also feel it's a more personable relationship. I think for that context it's also because the person probably gave a name card, so you can see that as consent that they gave you the phone number. But for us, if we have the phone number and we suddenly call them, I think they could really report us to the police and we could get into big trouble. Okay, we have one last question, yes.

Actually, following up on that answer — I thought the follow-up question was not a very good idea. For me it's not a very good idea because when you contact an interviewee for research, there's a context to it. When you start following up later, the person has walked out of that context, and the data collected might be different from your actual research, so it could misinterpret some of your user research results. Okay, to give you more context about the previous experience I had: the question at that time — it was an EV driver — was about their particular car and the particular charging type. I was asking that question because it's a factual question she should already know the answer to. We actually did catch the answer; it's just that the person taking the notes didn't have enough technical knowledge to note down that particular brand or name, so we did a follow-up because we were trying to get a more exact answer. But yeah, I agree with your point.

I also agree. In my case, whenever we follow up, usually it's because the point that was mentioned is very, very technical and we realise, after we've got all the notes, that we still don't understand the term, so we will reach out to the interviewee to ask them to explain more. That's the first case. The other situation is when the interviewee actually promised to follow up — sometimes our interviewees will say, I'll send you this information, and then they don't — so we will mention this to them and they might elaborate, continuing the conversation. Those are the two situations where we actually follow up. Yeah, moving on.

Okay, so we've gone through — I was about to say halfway, but actually it's not halfway, this is the last section. We'll be talking now about generating insights and reporting findings. We've talked about planning and conducting, and now we're talking about analyzing. When you have all of this data, the process is about analyzing it. When we crafted this whole workshop material, I really wanted to deep dive a bit into this, because I think there's not enough out there: when you Google, when you go to the internet and look into research process articles or Medium blogs, a lot of them talk at a high level — oh, you should do the interview like this and this — but no one really deep dives into how to take notes, and then, after that, what next. So we realised that analyzing the insights can be divided into four steps: capture, which for us, in our context, is for example the findings spreadsheet; then connect — connecting the insights, trying to find common themes; then we craft the insights to share with others; and then storytell — storytell as
in: you have the insights, and then how do you make them compelling to other people, your stakeholders? As researchers, I think we shouldn't just focus on qualitative; we should look into quantitative too. For our case we use Google Analytics for our quantitative side, and I would say that as researchers we should treat quantitative data the same as qualitative data, so that we spend time not only on qualitative things but also on quantitative data. In terms of capture, I'll take user interviews as the example for qualitative, and Google Analytics for the quantitative part. Capturing, for us, is note taking for interviews; for analytics, it's the dashboard that you have. When it comes to connect: for interviews we have comprehensive findings that we have cleaned up from the raw findings, which I will go through after this; for analytics, when you have a dashboard and you want to see more about a particular number, you deep dive — for us, into a spreadsheet, which I will also go through later on. Once you have the findings, you try to connect the dots: you look for common themes, connect the themes you have, find the key themes, and come up with insights. After that comes the reporting and storytelling process.

So, as I mentioned before, when the note taking is still raw, it's very hard to compare between the findings, between the comments. For example, for "tell me about yourself", when every answer is two paragraphs long, it is extremely hard to compare between the answers. That's why, after note taking, we try to clean up. Cleaning up means we summarize the findings and highlight the main points of each answer. Sometimes the user answers a particular question in a way that already answers the next few questions, so when we clean up, we split it into multiple answers and put them into the right questions, the right columns.

This is an example of our cleaned-up interview. These are the questions, and the blue text is the summary for each answer. For example: "How long and how far can you drive after you charge your car?" Before we summarized it, it was quite hard to come up with a conclusion based on the answers, because one was two paragraphs, one was a paragraph, one was a sentence, whatever. After cleanup it reads: 370 kilometres, 350 kilometres, below 330, 300 kilometres. Coming up with a conclusion for that particular question becomes much easier, and the same goes for the rest of the questions. Once we've cleaned up the interview — in most cases of research we would come up with the insights on our own, but when you want to bring more people, more stakeholders, into the process, you can try to come up with the key themes using Post-its, together with your stakeholders, like in this example.

When it comes to research findings — this is going to be quite a debatable one, but again, this is our practice — there are two types of research findings. The first type is observation patterns, which are very clear, very concrete; they are backed by evidence. For example — let me jump back — in this case, all the blue text in this row: you can form a concrete pattern based on all of these findings. But there's another kind of finding from research, which is called an insight. The naming may differ from team to team, from person to person, and there's a lot of argument about it.
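As a rough illustration of that cleanup step, here is a minimal Python sketch — not our actual tooling; the question key, participant labels, and figures are invented to mirror the driving-range example above. The only idea taken from the process described is one short summarized answer per question, per participant:

```python
# Hypothetical sketch of the "clean up" step: raw note-taking text is
# summarized into one short answer per question per participant, so
# answers line up in columns and are easy to compare.
import re

# One row per participant, one column per question (already summarized).
cleaned = {
    "P1": {"range_after_full_charge": "370 km"},
    "P2": {"range_after_full_charge": "350 km"},
    "P3": {"range_after_full_charge": "below 330 km"},
    "P4": {"range_after_full_charge": "300 km"},
}

def column(question):
    """Collect every participant's summarized answer to one question."""
    return [answers[question] for answers in cleaned.values()]

def km_values(answers):
    """Pull the numeric kilometre figure out of each short answer."""
    return [int(re.search(r"\d+", a).group()) for a in answers]

kms = km_values(column("range_after_full_charge"))
# With the answers summarized, a conclusion is one line of arithmetic:
print(f"Reported range: {min(kms)}-{max(kms)} km across {len(kms)} drivers")
```

Once every answer is a short summary sitting in the right column, the "conclusion" for a question is often a one-liner over that column — which is exactly why the cleanup pays off.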
Debates aside, an insight is more of a leap of intuition based on the qualitative findings that we have. I'm going to give examples of an observation pattern versus a research insight. For the observation pattern: let's say you have an e-commerce shop and you want to figure out which communication channels you should start, so that your customers can reach you. Maybe you have phone calls, social messaging, emails, and chat — four different channels you're considering — and you want to do some research on your customers to figure out which channels they actually prefer to reach you by. So you go and do some qualitative research, and maybe the pattern you find is that all of your interviewees — maybe 8, maybe 16, however many — have different preferences, spread equally: two prefer phone calls, two prefer emails, two prefer chat, and two prefer social messaging, if I'm not wrong. The finding, the pattern here, is that there is no clear user preference on communication channel. Where does that observation lead you? Maybe down the path of building all four. So that's the pattern layer: what you can easily observe forming into a pattern by itself.

Where the insight comes in is when you draw on the findings across all of your interviews and go a lot deeper: why do you prefer this channel, why do you use this channel the most, do you use the channels equally for other e-commerce shops? And the insight — which may take a leap of intuition, or a leap of judgment, whatever you call it — may be that people don't actually care about the channel; they care about whether or not they're going to reach you in a timely manner. That insight is going to lead you down a very different path: it tells you that you just need to start one channel, but make sure that one channel runs really, really well, so people can reach you immediately. Between patterns and insights, the insight can be a little unreliable, because you are making a judgment call based on your own intuition and prior knowledge, and it's largely up to the researcher, or whoever is doing the interpretation, to figure out what exactly the insight is and what exactly the next thing to do is. But insights are where we often find a lot of unexpected next things to do. So those are the two different types of findings we get from research.

We've talked a bit about qualitative insights; when we talk about quantitative insights, I think it's extremely important for you as a researcher to measure using quantitative data — probably the easiest example is GA. I believe that if you don't measure it, if you don't have a baseline, you can't really improve it, because only with a baseline can you compare against the current numbers and tell whether what you're doing, or what you've done previously, is right or wrong. When it comes to metrics, there are three signs of a good metric. First, it's understandable — understandable not only by you, the researchers, but by the stakeholders as well. Second, it's a ratio or a rate, so that you can see the growth — whether it's 70 percent, 50 percent — compared to the previous figure. Third, it's comparable, so that you can compare it with your previous iteration, for example. And, as I mentioned before, we usually have a dashboard and put it on a TV. These are obviously fake numbers — we remade it — but it's something you can build easily.
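To make the "ratio or rate" and "comparable" points concrete, here is a minimal sketch with invented numbers — a hypothetical payment-conversion metric compared against the previous sprint's baseline:

```python
# Hypothetical example of a "good metric": a ratio, comparable across
# sprints. All numbers are made up for illustration.
def conversion_rate(completed, visitors):
    """Ratio metric: share of visitors who completed the action."""
    return completed / visitors

# Invented per-sprint numbers for one journey step, e.g. "make payment".
sprints = {
    "sprint_11": {"visitors": 4000, "paid": 520},
    "sprint_12": {"visitors": 5200, "paid": 728},
}

baseline = conversion_rate(sprints["sprint_11"]["paid"],
                           sprints["sprint_11"]["visitors"])
current = conversion_rate(sprints["sprint_12"]["paid"],
                          sprints["sprint_12"]["visitors"])

# A raw count (520 -> 728 payments) looks like growth no matter what;
# the rate stays comparable even though traffic also grew.
print(f"baseline {baseline:.1%} -> current {current:.1%} "
      f"({(current - baseline) / baseline:+.1%} relative change)")
```

Because it is a ratio with a baseline, the metric stays meaningful even when traffic changes between sprints — which a raw count would not.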
Using Data Studio, you can just integrate it with GA or Firebase, turn it on a TV, and put it anywhere you want people in the team to see the numbers. Now, the structure of our findings spreadsheet for quantitative — this is typically how we structure it. We look at it by time period, ideally by sprint, so that you know whether, in that particular sprint, your design and your research were working or not. Then we look into the customer journey. The customer journey for our case is, for example: whether they sign up and actually complete the sign-up, whether they monitor their utilities, whether they make a payment, whether they refer other users, and so on. So we look into the customer journey of the product; we look into the product and its features, how particular features are doing; we look into releases as well, so we can monitor which release is working; and we look into campaigns, because in a big company you'll have a marketing team running marketing campaigns, and a lot of the time, because things are quite siloed, you'd otherwise wonder: how come the numbers changed? And it's actually a campaign, not something about our app. So this is an example — obviously fake numbers again — of our own team dashboard, not the Data Studio one; this is where the designers and researchers look into the numbers. We look at it monthly; for the customer journey we look into each particular screen; and at the bottom there would usually be campaigns and releases.
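The customer-journey part of that spreadsheet boils down to a funnel computation. Here is a hedged sketch: the journey steps match the ones just described, but the monthly counts are invented:

```python
# A rough sketch of the quantitative findings structure: monthly counts
# along the customer journey, turned into step-by-step conversion rates
# so that each period is comparable. All numbers are invented.
journey = ["sign_up", "monitor_utilities", "make_payment", "refer_a_friend"]

march = {"sign_up": 1000, "monitor_utilities": 700,
         "make_payment": 350, "refer_a_friend": 70}

def step_conversion(counts):
    """Conversion from each journey step to the next, as ratios."""
    rates = {}
    for prev, nxt in zip(journey, journey[1:]):
        rates[f"{prev} -> {nxt}"] = counts[nxt] / counts[prev]
    return rates

for step, rate in step_conversion(march).items():
    print(f"{step}: {rate:.0%}")
```

Laid out per sprint or per month, a table of these step rates makes it immediately visible which journey step a release or campaign actually moved.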
Some research efforts are marked there as well, so that you can see, for example: you did research here, the recommendation was built in that particular month, and then the next month you can see whether the numbers improved. Something went wrong in April? Yes — fake numbers.

You've done the research; now it's time to share it. Because we only have 10 minutes left, I'm going to speak through the rest of this section really quickly. When it comes to crafting good insights, there are three principles — I'm overusing the word "guidelines" here. Generally, you want to keep them informative: you want each of them to be inspiring and informative about what you're doing. You also want to make your insights memorable, so that people can recall them; people are not looking at your report all the time while making product decisions, so you'd better make the insights memorable, so that people have them at the back of their minds. We talked a lot about our note taking: for both qualitative and quantitative, our data is very, very comprehensive. This helps keep us honest, and it helps keep our insights believable, because we can easily point and say, this insight is backed by this data — you can go back and look into it as well. The second tip is to let intuition guide you, which is what we just talked about; sometimes that will help you find something great. Spending a lot of effort on very comprehensive data also ensures we are less biased in our insight generation. We also try to do insight generation as a team, to catch any missing perspectives and, again, to keep ourselves honest and make sure we're not being swayed by the loudest voice in the room. We try to get feedback on a quick draft report by presenting it to someone we trust, so we can check: does this make sense, is there anything missing? We try
to tell specific stories. I noticed that everybody listened when Nanen was talking about the EV driver earlier — we try to tell stories like that: very specific stories, to make it very engaging. And then we add quotes and pictures as well, to increase the engagement factor again, to make people understand and believe that these are real people we're talking about and real quotes — all the grammar errors, the Singlish, whatever language we were speaking in, it's all in there.

This is an example of our report, and it's the actual report. The only thing we changed was the pictures of the users — we always try to put pictures of the users, and we always ask consent first, whether they're okay with us putting their pictures into the report. Typically this is the structure of our report for discovery. These are the main insights; we try to make them short and memorable, more like a pointer to the most important thing from the conversations. Obviously, when I present to the stakeholders, I deep dive into each of them, but I wouldn't put every supporting detail on the slides — I put only the most important things, and I put the quotes over here, so the audience can read the quotes and see that these quotes support these insights. So, for example, I would go through this and say: as you can see, there are two people saying such-and-such. That is usually the structure for discovery insights.

Then, when we test designs, one of the most important things in the report — what the audience typically wants to see most — is the recommendation. As a researcher, it's extremely important that you're not only telling the insights but also giving a clear recommendation: as a researcher, my recommendation is to go with design A or design B, or the design should be changed into this and this. And you don't have to show all of the insights you have; I think it's also your job as a researcher to understand what your stakeholders are looking for, and to show only what they're looking for and what you think is most important. That's why we always put the before and the after as well, and we always highlight why the old one is not working; the one on the right is the recommendation we're going with, with the reasons based on the interviews. Yes, so I think this is the end.

Okay, maybe I'll jump to the recap and then we'll start fielding questions — we have two from the Post-its. A quick recap of today's workshop: we did an introduction to user experience and user experience research; we covered planning, what we do to plan a user interview; we covered conducting the interviews; and we talked about how we generate insights and report findings. In the meantime, while we're answering questions, if you have some time, please fill in the feedback form — the feedback will be very useful for us for future workshops. But for now, we are going to start answering some questions. Okay, thank you.

First one: how do you get your exact user segment via a recruitment agency? So I think one of the things — I put it in the handout as a note, but I don't think we mentioned it in the slides — is that when you approach a recruitment agency, you should know exactly who you are targeting. It's not that you work with the recruitment agency to figure out which users you want to target; you go to the recruitment
agency knowing exactly: okay, I want users who are, for example, 40 to 50 years old, who use our app, who have this smart home device at home, and so on. When you approach a recruitment agency, you should know exactly who you want to target.

On no-shows — maybe it doesn't really happen for B2B, but for us it happens a lot. For example, when we recruit seven people, typically when it's raining, the users don't reply to our messages and mostly they bail out. It happens a lot. That's why we always use calendar invitations, and when a user doesn't accept our meeting invitation, we keep chasing them: you haven't accepted the meeting invitation — do you know about this interview, do you still want to take it or not? That's one of the things we do. Another thing is that we always try to buffer the users we recruit: for example, if we need seven, we usually try to schedule more than that, and if all of them turn up, then it's good — we get more users. We have more questions — wait, this one first, I think. Can you give a practical example of how you qualify interviewees in a screener? What was that? In the screener — can you give a practical example of how you qualify interviewees?

Okay, so I actually put an example of screening in the recruitment section of the handout — an example of our Typeform. Typically we send a screening form using Typeform, and we blast the Typeform out to our pool. We already have some screening data in our pool — for example, whether they use our app, where they live (not the exact address, but the area), and whether they have a smart home device. The screening questions are always different for every research round, because the targeted users are always different for every research round. Usually it's something aligned between you and the PO — for me, between me and the PO — for example: I want our app users for this round, or I want non-app users. We have a very specific target, we put the questions into Typeform, the questions are yes or no, we look at the submissions, and we handpick the users based on the submissions.

Next one: how do you conduct user research with non-local users? I haven't conducted research with non-local users here, but I have done so in my past experience, and through listening and talking to friends. A lot of my interviews in the past were actually conducted through video calls — you lose a lot of the body language there — but a lot of my users were based in the US, or Australia, or Europe, and it was way too expensive to fly me around so often, so very often I just had to make do with video calls, and we
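Going back to the screener question for a moment, the handpicking step can be sketched like this. Everything here is hypothetical — the field names, criteria, and respondents are invented; only the idea of filtering yes/no screener submissions against criteria agreed with the PO comes from the answer above:

```python
# Hypothetical sketch of hand-picking interviewees from screener
# submissions: yes/no answers are filtered against this round's criteria.
respondents = [
    {"name": "R1", "uses_app": True,  "has_smart_home_device": True},
    {"name": "R2", "uses_app": False, "has_smart_home_device": True},
    {"name": "R3", "uses_app": True,  "has_smart_home_device": False},
    {"name": "R4", "uses_app": True,  "has_smart_home_device": True},
]

# Criteria for this round; every research round gets a different set.
criteria = {"uses_app": True, "has_smart_home_device": True}

def qualifies(person):
    """True if the respondent matches every criterion for this round."""
    return all(person.get(field) == wanted for field, wanted in criteria.items())

shortlist = [p["name"] for p in respondents if qualifies(p)]
print(shortlist)  # the pool to hand-pick and over-recruit from
```

In practice the shortlist would then be over-recruited, per the buffering advice earlier, to absorb no-shows.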
figured things out from there. Sometimes I did do overseas trips, and on an overseas trip you try to pack in as many interviews and customer visits as possible — I've always been doing B2B, so as many office visits and customer visits as you can. We also have some notes about remote research: in my previous work, most of the research I did was actually remote, so we put some notes in the handout about B2B research and specifically about remote research as well — I forget which page.

There's another question: what are your thoughts on getting observers into a session? Okay, this happens. For example, on the EV project — I'm very thankful I had very good stakeholders who were extremely interested in the users — the main stakeholder told the whole team, which consists of 20 people, that it was mandatory for them to attend the interviews. That's impossible for me as a researcher, to allow that many people into an interview with one person. So what we did is cast the whole session, using Skype, to our office. We did the interviews somewhere near the users' charging stations — so it's a very fresh experience: right after they charge, they talk to us — and we cast the whole session to our meeting room. In our consent form we had already said that the whole session would be recorded and casted, so they knew everything was casted; but while they're talking to us, they don't feel like they're being watched by a whole crowd. Again, it goes back to this: when you're with a stranger asking you questions, how many people do you want watching you answer? It can feel really stressful. So it comes back to all the trust and comfort and rapport; that's why we are pretty strict about how many people are in the interview itself.

There's another question: can user interviews be done by sending online forms, for example Google Forms, instead of face-to-face interviews? If it's a form, I would say that's a survey, because in a form you can't really ask follow-up questions — you can't ask, why is it like that, how is it like that — while the gist of the user interview is focused on the how and the why, and you wouldn't be able to get the how and the why from a form. And in a form you can't really make sure that the answers they give address what you're actually looking for; you can't really control it. In a one-on-one session you can: you can clarify what you're asking, and you get a lot of expression. So I'm not saying you can't use a Google Form, but maybe you want to rethink your methodology — if you can't do a user interview, whether a survey works instead is something you should consider. A survey is a different methodology for finding different forms of data; maybe you want to blast it out to more people, and that's when you would use a survey. But the beauty of a user interview is that it's very rich in data — and who would spend more than 20 minutes filling out a survey? So yeah, it's a very different method of research; in general it gives you very different data.

Okay, I think we have exceeded the time a little bit, so thank you so much for your time today — you've been an excellent audience. Thank you, yeah.