So, a little bit of background on me. Some of you may have been in my workshop yesterday. I trained as an aviation human factors researcher; that's actually how I met Steve Fadden, when we learned that our advisors were collaborators. I've built small and large teams in different-sized organizations. As you heard, I teach at Kent State in the UX Master's program. And I'm a husband and dad; that's me and my kids trying to make cake. And as I said yesterday, I'm also a Pixar plot device. It usually goes through people's minds, so I just get it out of the way.

This is what we'll be covering: what user research is, why we do it (as I said before, we'll go through that fairly quickly), and then how to do it. But I suspect many of you already know how to do it, so we'll get right to the resources. What's not here is things like how to build a UX research team or deep detail on analysis and communicating findings. However, I do have presentations and content on those topics listed there. You don't have to copy the URL; you can take pictures if you want, but you can always grab the slides.

Let's do a quick definition for the folks in the room who don't have a good sense of what field research is. It has many names, and I don't really care what we call it. I actually prefer to call it whatever resonates with our stakeholders. If they call it, as I mentioned yesterday, "follow me to the office," or "going to the field," or "voice of the customer in the field," that's fine, as long as we're concentrating on the person or people in the real world.

Why do we do this? I think we all know the answer: because context is king. Watching people in the field is king; it's just a play on the phrase "content is king." Why is that? Where do you learn more about what people's tasks and goals are: in a room like this, or in a room like this? So I heard, or you indicated, that many of you are researchers.
What's the most exciting part about doing the research? Anybody? Like that? Yes: going out there and understanding people and their world. There's nothing more joyful than going, watching, and learning. And the first hour, or the first day, depending on the complexity of the domain, can be so confusing. Maybe even the first week. But once you start seeing the patterns, it's a great feeling.

Why else do we go? Behavior doesn't lie. People don't intend to lie when they're in a sterile office or a research room, but they cannot accurately recreate the steps they take and why they take them. They end up confabulating; it's the paradox of the expert. They have become so efficient in their work that they can't explain the minute details. But we, as outsiders (I just found the microphone), can see the small details, because we're not familiar with the chunking that they've done. And when you observe people in context, what's better than being able to ask follow-up questions? You can't do that in a survey. You could ask follow-up questions in the sterile research room, but you really won't know which follow-up questions to ask if you're not watching them do the thing.

So almost all observation projects are run for one of two reasons. One is evaluative: to watch how people are using your existing solution and find ways to improve it. The other, increasingly, is generative. I love that word because it shows how closely we partner with our product owners and product management teams. I've found that over my career, my teams and I have delivered more and more value by working with product management to help them understand what the requirements are and which stories to write. Has anyone else had that experience of becoming a close partner to product? Yeah, it's a great feeling, isn't it? Makes us feel like we're not going to get laid off. Okay: how to do field research.
You will often encounter resistance, unless you are so well integrated with your product team that they regard you as indispensable. Here are some of the typical objections that, judging from the room, I think we've all felt. The project manager says, "This takes too long. I've got my sprints laid out. We can't do this." The founder, or the VP of sales, says, "I know what they need. You don't need to go out and watch." Or, "I'm not going to let you, because we're doing contract negotiations and I don't want you speaking to the users." And then the marketing team: "Let's run a survey." Another survey, hooray.

So how do you counter some of those objections? "It takes too long": those of you familiar with the agile design sprint or the sprint-zero concept know that you can do quality research in as little as two or three calendar weeks. And if you can get some planning done in parallel with other early, pre-sprint activities, you can do a one-week research sprint. It's possible. I see some head shaking. Yes, it's tiring, but it's exciting.

"I know what the customers need": this is a hard one to counter. I'm not going to lie; I've had difficulty overcoming this objection. But if you can frame it in terms of lowering risk, of making sure you release the right thing, that can help. Unfortunately, many CEOs and high-level executives feel that because they understand the business, the domain, and the finances, they're qualified to be Steve Jobs Jr. It's a really tough one to fight. Have many of you heard the acronym HiPPO, the highest-paid person's opinion in the room? Yeah, it's hard to counter, and if anyone has any good ideas on how to do that, feel free to share.

"We can just run a focus group or a survey": this one you can flip. You can flip a marketer by bringing them to the field, just as you can flip an engineer into a supporter of UX by bringing them into the usability lab.
A few walks out into the field with them and they'll realize that surveys are complementary but can't answer everything.

I'd like to cover a few philosophical considerations, then we'll get into the practical considerations, and then talk about some of the resources I have for you. I was taught this when I worked at Intuit, which makes QuickBooks and TurboTax, by the founder, who was doing user experience work when he was first creating the company. Back in the 1980s he would go to the boxed-software stores, watch when people picked up a box of his software, and ask whether he could follow them to the office, or follow them home, and watch how they used the product. And while that probably creeped out some people, enough people participated that he was able to gain a real understanding. After the company grew, he came and talked to us, and he said: ask questions because you don't know the answer, not because you want to show how smart you are. I've always tried to remember that I'm not trying to demonstrate what I know; I'm trying to learn what I don't know.

Now some practical considerations. You need to decide what you want to investigate. It's okay, especially in generative research, to take a journey and explore what's happening in the world of the user and their context, but you need your stakeholders aligned on that being your goal. I've been out to the field where the expectation was that we would rigidly define how the product could fit into existing workflows, and I brought home more exploratory, generative data, and that did not go over well with my stakeholders. Here's an example goal statement. During discussions with the client, we identified the following goals and constraints.
We want to study and document current users' workflows, establish where the product impedes workflow efficiency, uncover users' wants and needs for increased workflow efficiency and data presentation, and then redesign the workflows. Very concrete. We don't have to read all of that, but it is important to make sure your stakeholders identify your constraints, and if you ask them, they'll say, "Well, number one, we don't want to lose people along the way." I don't know where the laser is, so I'll just point. "Do not disconnect the installed base," meaning do not disconnect the existing users; do not redesign the workflow out from under them, because a lot of people have become efficient at this non-optimal workflow. Then it becomes a decision: do you fork the product and create a separate workflow for new people, or keep the old one? Forking is usually a bad idea, but sometimes it is the right thing to do. Here's another constraint I encountered in this particular project, for this product suite: preserve the existing shortcuts and accelerators. Makes sense, right? It's just another example of "don't disconnect the installed base and pull the rug out from under them." And then there were some other constraints about the technical environment.

Do that: make a plan. It's not always going to go according to how you planned, but it's so much better to have a plan. Yesterday we talked about how a big part of our job is being a change agent and driving UX adoption. This is the other piece: being a good project manager. That doesn't mean you have to go get your PMP, spend 2,000 hours, and have a second career; just use simple tools, and I have this built into the resources I'm going to share. It can be as simple as a Gantt chart like this, but if you don't start with something, no one's going to know where you are. Other considerations: what type of data do you want at the end?
So you can go in with the idea that you're going to code everything you see into predefined groups that you feel are the right ones. I don't know about you; I do not usually go in imposing a structure. Has anyone actually done a research project where you were very clear up front about what the behavioral categories were? I'm not going to criticize; I think it can actually be the right thing. Okay, so affinity diagramming and Post-its? Yeah, unstructured observation usually reveals more of the unexpected.

Recruiting users: there are so many researchers in the room that I don't think we even have to go through this, except to say try to get to the actual user. Don't let marketing or sales say, "Just talk to the people who do purchasing."

How many of you feel 100% ready when you go to the field that first day? No. It feels like being in the school play: everything is getting ready, you hear the people tromping in, and you're behind the curtains, nervous. That's the same feeling. You're always going to get it, and that's why we have structure and experience: so we can get beyond those first-day jitters and accept that you're probably going to stumble over your words or miss a few things. Frankly, one thing we can take away from survey research is that it's okay to run a pilot session. It really is. You can treat your first session as your pilot, then debrief with your team and say, "Okay, let's change the protocol in these ways, ask these different questions," et cetera.

Collecting data: I have a very idiosyncratic way of doing it. I type very fast and I can touch type, so I use a laptop. You don't have to be able to read this; I have a link to it, and it's open source, so you'll be able to use it or adapt it if you want. Use a format that works for you.
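For those who, like me, type their notes on a laptop, even a tiny script can keep raw observations timestamped and searchable as they pile up. This is a hypothetical sketch, not the open-source template from the slide; the column names and helper are my own invention:

```python
import csv
from datetime import datetime

# Hypothetical note-logging sketch: append each timestamped field
# observation to a CSV so notes stay searchable and sortable later.
# The columns (participant, tag, note) are illustrative only.
def log_observation(path, participant, tag, note):
    """Append one observation row with a wall-clock timestamp."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.now().isoformat(timespec="seconds"),
             participant, tag, note]
        )

log_observation("field_notes.csv", "P3", "workaround",
                "Pastes order IDs into Notepad before switching screens")
```

The tags can stay loose during fieldwork; they only need to be consistent enough to group rows during affinity diagramming later.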
When I was at UX Singapore last year, I met a researcher from Australia named Ruth Ellison who uses Microsoft OneNote. She takes notes in free-form format, not with the question-and-sub-questions structure; she takes them into OneNote, uses OneNote's audio-recording capabilities, and tags everything. It was amazing, and I want her to give my students a workshop on how to do this. I thought it was fascinating. Frankly, pen and paper also works.

But I would say: summarize daily, no matter how tired you are at the end of two or three observations in a day, because of the way our minds work. We lose the detail. The longer we wait after a particular session, the less rich the information will be, and the more it will be affected and corrupted by the other observations we've done. This is an example of a very rich summary of a visit. It doesn't have to be like that; this particular client wanted to understand the main findings and what they meant for how the product should change. Typically, my daily or visit summaries are a few paragraphs with some bullet points, and that helps align stakeholders, keeps them in the loop, and helps them understand what we're doing, what value they're getting out of it, and why.

Good, now we're at the resources. Time check: how are we doing for time? This is something I put together, but I don't want to claim all the credit; I've adapted it from other people. There's a saying: if you steal from me, you're stealing twice. It's available on Google Docs, open to the world. It's simply a planner, a project template planner. You've probably seen a version of this many times; you probably have something like it. Take whatever you want from it. Remember what I was saying about the daily session recap? It can be as simple as this. This is an email.
I showed you a document before; this is just a simple email to your stakeholders: "Daily recap for this day. Here's what we did for prototype testing in the field, who the participants were, and some of the highlights." That's all. The great thing about this is that it writes the report, or the debrief, as you go. Less work at the end.

Let's not dwell on this, since there's so much experience in this room, but we all know there's the analysis piece of breaking down the observations, and then synthesis. The slides make it somewhat clear. Is there anyone in the room who has not done an affinity diagramming session? Okay. If you want, we can chat afterwards; I'm happy to talk with you about it. It sounds like we have a lot of deep experience with this kind of analysis and synthesis.

People won't read a giant report. I think we've learned that as a field over the past 15 years. So what do you do? By the way, this is the cover page of an 80-page report. Work with your stakeholders and run a stakeholder debrief session: gather them, present the raw findings, and let them draw conclusions as a group. Among the pros: it gets them involved in the data analysis and the synthesis, and they may have the insights, because they have deep domain experience. Among the cons, as I said: they may start to dominate. There are ways to facilitate a session and keep the highest-paid people from dominating. Things like brainwriting: writing down insights quietly yourself, then sharing them as a group and doing the affinity diagramming. Another technique is letting the most junior person go first, so their insights can be heard without the pressure to agree with the HiPPO.

Interior decoration: another thing I've seen more and more over the years is that we've embraced the idea that putting the information out into the world is a positive.
And there are some examples. This is an actual team room, you know, the scrum room. We talked about this at the beginning: user research is almost always sprint-zero work. It happens before the stories and epics are fully defined, and frankly, it's there to inform those stories and epics. We can do it quickly. If you get the chance, look up Google Ventures' one-week design sprint. I see some head shaking. I think there are valuable insights in that, and I like that model as well.

Remote research: I've been doing more of this, just on the phone or via the web. There are logistical challenges; sometimes it's hard to get users to load a little executable on their machines. But it's better than nothing. And finally, here is a research comparison tool, something I put together for evaluating different remote research tools. This one happens to compare, I think, WebEx versus Zoom. If you need to make decisions about which tool to use, this is available for you as well.

So that was the two-minute mark. Wow, I feel like I'm right on the mark. Questions? No questions? Are we good? We do have one. Go ahead. Yes, the customer journey map. The question was which techniques I use to synthesize and summarize the data, and it really depends on the customer's and the stakeholders' needs. I find journey mapping to be effective, but if you're not careful, journey maps can grow into a beautiful piece of useless information. I think they need to be very constrained and stick to the facts. The reason they get over-decorated, I think, is that there are a lot of consultancies, and people feel the need to deliver something beautiful, so it becomes over-engineered.
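The kind of tool comparison mentioned above (WebEx versus Zoom, for instance) often comes down to a weighted-scoring matrix. Here's a minimal sketch of that idea; the criteria, weights, and scores are invented for illustration and are not the talk's actual comparison:

```python
# Hypothetical weighted-scoring sketch for comparing remote research
# tools. Criteria, weights, and scores are invented for illustration.
criteria = {  # weight per criterion (weights sum to 1.0)
    "participant setup effort": 0.4,
    "recording quality":        0.3,
    "screen sharing":           0.2,
    "cost":                     0.1,
}

scores = {  # 1 (poor) to 5 (excellent), per tool
    "Tool A": {"participant setup effort": 3, "recording quality": 4,
               "screen sharing": 5, "cost": 2},
    "Tool B": {"participant setup effort": 5, "recording quality": 3,
               "screen sharing": 4, "cost": 4},
}

def weighted_total(tool):
    """Weighted sum of one tool's criterion scores."""
    return sum(criteria[c] * scores[tool][c] for c in criteria)

for tool in scores:
    print(f"{tool}: {weighted_total(tool):.2f}")
```

Weighting setup effort heavily reflects the point above: the hardest part of remote research is often getting participants running at all.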