So this is the user testing crash course. I'm Jeremy Kriegel. I'm the UX director at a company called Omnicell. We do pharmacy robotics, which is automating medicine dispensing, inventory management, et cetera. I saw a keynote a number of years ago where the wonderful writer Kathy Sierra asked the audience a question. She asked, which would you rather have: people saying how great your company is, or how great your product is? Some people raised their hands for company, some raised their hands for product. And of course, anytime someone asks you a question with only two options, it's probably a trick question. The answer is always C. The answer is how great they are because of your product or service, right? There are things that you use that are frustrating. You hate them, but you have to use them to get your work done or to accomplish something, and your customers are the same. They want you to succeed. In fact, they need you to succeed so they can be awesome. Kathy Sierra also has this wonderful graphic she calls the kick-ass curve. If we put ability on the left and time on the bottom, she marks two thresholds. There's the suck threshold, when your ability is low and you're still learning, and it feels like "I'm not good at this, I'm terrible." It feels bad; you're not very effective. And at the higher level there's the passion threshold, when you get into flow, where you're really good at something, you're effective, you feel great, and you lose track of time. That's when you're thinking, "I rock, I'm awesome." So the faster you get people over the suck threshold and into flow, the more likely you are both to really serve people's needs and to create loyal, dedicated customers. How do you do that? We've got to talk to them and understand how effectively they're using your product, or what problems they're having that you could be solving.
And as obvious as that is, and I'm sure most of you were thinking, well, of course we know we have to talk to people, my experience has been that there's a huge reluctance to actually get out there and engage with the people we're trying to serve. I've seen this in many different companies, over and over again. And I get it. Talking to people can be daunting. We're busy with other things. If we find out we're wrong, we might have to do rework, and we're on a calendar. It's outside our comfort zone. All kinds of reasons. But if you have a plan, and you're a little more confident in how to navigate things, I find it can be a lot easier. So I'm going to talk about a couple of things to get started, and then we're going to dig into the 13 tips to make it far more effective. The first and most important thing is you have to have a focus. Why are we doing this? What do we hope to achieve? What are we trying to learn? What assumption are we trying to test? A couple of questions you can ask are: where are we FOGgy? That's FOG, for facts, opinions, and guesses. What do we think we know to be true? Those are our facts, and we may need to validate them. What are our opinions? Things we're fairly sure are true, and again, we might want to validate them. And where are we just guessing? That's probably where there's a lot more risk. What are our known knowns, known unknowns, and unknown unknowns? Again: the things we think to be true that we might want to validate, the things we know we don't know that we need to investigate, and then the things we haven't even thought of, that we don't even realize we don't know. And what I find interesting is that the known knowns and the known unknowns seem to be shared across the industry.
Everyone kind of knows the same things; people move around, you read the same reports, yada yada. And we also know what we don't know. So the unknown unknowns, that's the interesting part. That's where a lot of the insights that lead to innovation come from. And finally, one of the last questions I like asking is: where, if we're wrong, are we in trouble? Let's just put it that way. So the first thing you want to do is make sure you're really clear on your focus. What are you trying to learn? What assumption are you trying to test? What do you want to get out of talking to people? You need some guide so you can make the session productive. So that's your why. And then, okay, who do you need to talk to to learn that? The thing to remember is: you're not your customer. Even if you were at one time, once you've joined a product team, once you've joined a company and you're making products for people like you used to be, unless you're making products for other product owners, designers, whatever your role is, you're no longer your customer. I'll give you a great example. A number of years ago, I worked at a startup that was serving physicians, and it was founded by a former surgeon. Now, that gentleman knew far more about the day-to-day life of a physician than I will ever know. But at that point in his career, every day he spent as a CEO, he was becoming further and further removed from the day-to-day life of a doctor seeing patients. So again, even if we were in that role before, we're not now; now we're product people. So we really want to try and get as close as we can to the people we're trying to serve. So who are those people? One simple way is to define them with demographics, psychographics, and firmographics. Demographics is the data that describes people: age, gender, income, et cetera.
Psychographics is more about the mental landscape. In marketing, they sometimes refer to it as AIO: activities, interests, and opinions. What do these people do? What do they believe? What do they think? What are their hopes, fears, et cetera? And firmographics is like demographics for the workplace. What's the industry? What's the size of the organization? What's their role? How experienced are they? How senior, et cetera? You might not need all of these, depending on your problem. Just to give you two quick examples: I worked for a customizable jewelry company many years ago, and we were doing a study around engagement ring buying by heterosexual males. For the demographics, we knew a rough age range where a lot of people were getting engaged, and income was important at that point. Psychographics were really important, because we knew they didn't know a lot about jewelry shopping, and they were really uncomfortable making some of these decisions given their lack of knowledge. Firmographics were completely irrelevant; we didn't really care what industry they were in, where they worked, et cetera. Now, working in pharmacy robotics: if I'm looking at a director of pharmacy in a big hospital, the demographics might be less important. Certainly, at that senior level you're in a certain age range, but it doesn't really come into play in terms of helping us make better decisions. And that's really the thing here: you want to be able to make better decisions. Psychographics are relevant in terms of what's important to them, what they spend time on, where their focus is, how they adopt technology. And firmographics, the size of the hospital, their level of experience, that's really important. So again, different problems will lead to different emphasis here. Now with that, I can create some assumptions. I'm making some guesses about who I think I need to serve so I can start the conversations.
So that's my why, I have my who, and then there's: what am I gonna do? One of the most basic things we do is a usability test. We put a product in front of someone, we give them a task to do, and we see how effectively they can complete that task. That gives us a sense of where we need to make improvements. For a real-time moderated test, you're gonna create a script, you're gonna have a location, possibly virtual, and you want two people: a facilitator and a note-taker. You'll probably have observers too, preferably in a different room; we'll get to that later. There's also unmoderated testing, where you just put stuff online with a script. It's good for confirmation, but you don't really get to explore things. Again, I'll talk a little more about that later. Another technique that's relevant in today's conversation is ethnography, and that's more about discovery. Usability testing tends to be weighted more toward validation with a little bit of discovery, and ethnography is much more open-ended: I'm interested in this problem or this space, so I wanna go spend time with people, watching them do what they do now, to understand what's working and what's not, and how they think about things. Then you can figure out how you might craft a solution that fits in with that. So those are the two things today's techniques are gonna focus on. Now, there are far more research activities you could do depending on what you're trying to learn. You might put out a survey, you might do some card sorting, you could do a tree test; there are lots of research methods out there. But we're really gonna focus on these real-time individual interactions. When you're planning, I thought I'd throw this out there: this is a variation on a canvas that Jeff Patton created. You can start with your hypothesis, your bet, what you believe to be true.
What are the assumptions baked into that, and what are the questions you need to ask? Of those, what's the most important thing you need to test or learn now? That's number three. Number four: what are all the possible ways you could either test that assumption or fill in that learning? And then five: what are you going to do? Ideally, that's the thing you can do with the least amount of work that will give you the maximum learning. You still wanna maximize your ROI there. And then lastly, number six: what did you learn from that? There might be more than what's gonna fit in the box, in terms of a report, a deck, et cetera, but at least now you have a one-pager that closes the loop. So if you have people joining your team, or if you wanna go back and reference activity you've done in the past, you have these quick summaries you can go to. In that first part, you have to have a protocol, and there's lots of material out there, so I'm not gonna go into a lot of depth on this, but it's gonna have four basic parts. An introduction: why you're there, what you expect, and setting some ground rules. Some context, which is usually questions to frame the activity and put the person in the mindset of what you want to talk about. Then there are the tasks you're gonna give them, or other areas of investigation, and then you're gonna close it out. Again, there's lots of material online about that, so we're not gonna dig in deep. I wanna spend most of my time talking about the nuances that make facilitating these conversations effective. All right, so here's number one. You wanna keep the environment comfortable, so people don't feel like a research subject. I frame this as: you wanna be just on the warm side of neutral.
If you're too encouraging, you run the risk of creating an environment where your participant, instead of giving you their honest thoughts, will start thinking about what they need to do to make you happy. "They're giving me all this positive encouragement and positive feedback, and now I really like this person and I wanna tell them what I think they wanna hear." And you don't wanna hear what they think you wanna hear; you wanna hear what they actually do and what they actually believe. So: a little bit warm, but not too encouraging. You don't wanna be cold either, because that can be distancing and a turn-off, and then people might not be open to sharing. Just the warm side of neutral. And don't offer encouragement either. You can give complimentary feedback about their participation, like, "Wow, I really appreciate how you articulated that. That was very clear. Thank you for sharing." You can participate in the conversation that way. But if you say something more like, "Oh, that's great, that's right," something that implies there's better and worse feedback, then again the participant might try to figure out what you want and give you that, so that they feel like they're performing. So that's taking that neutral attitude, maybe slightly positive. One other thing I'll say about keeping the environment comfortable, and this applies both in person and online: earlier I said you need two people. If you have a lot of people participating, especially in person, it can really make that person feel like they're being studied. If you have one person with a half dozen people huddled around them, that can be pretty uncomfortable. So especially in person, only have the facilitator and the note-taker in the room. You can stream the session to another location, and other people can watch from there.
Even online, if there are ways to hide the observers so the participant isn't aware of how many people are watching them, use them. It's a little easier online, because once you get started they'll sort of forget about all those other people, but you definitely want everyone on mute. You don't want anyone else chiming in. That's actually a role the note-taker can play. If other observers have a question, keep a Slack channel or some other messaging channel open to the note-taker. People can message the note-taker, and at the appropriate time the note-taker can interrupt the facilitator, "Excuse me, I wanted to ask a follow-up question," and ask the questions on behalf of the other observers. So that's how we keep it comfortable, so they don't feel like a research subject. Part of that is that neutral attitude, just slightly positive. All right, that's one and two. Number three: you want people to think out loud as much as possible, because we want to understand their mental model. Anytime someone goes quiet, I'm going to ask them a question. What are you thinking? What are you looking for? What were you expecting? Something that's just going to keep them talking, because the more they talk, the more I understand. So anytime someone's quiet for more than a few seconds, ask them a question to get them talking again. This is also one of those things, remember I said you're going to set context in your protocol, that we're often very clear about up front: we want you to talk constantly, and if you're quiet for even a moment, we're probably going to ask you a follow-up question just to understand what you're thinking. All right, next one. You want to keep it simple in terms of language. Clear language, no jargon. Even if you're in an industry that has a lot of very specific verbiage that you would assume your participant knows, it's better to minimize its use.
Just in case they're not familiar with a particular term, you don't want someone to feel stupid; that will tend to make people shut down. So even if there is a specific term, try to say it in a general way that's more accessible, in case they aren't familiar with it. Keep it open. You want to use open-ended questions as much as possible, because they expand the conversation. If I asked you a closed-ended question such as, "Did you eat breakfast this morning?", well, that's yes or no. It doesn't tell me a lot. I can open that up: "What did you have for breakfast?" That's a little wider. I could say, "Tell me about your morning routine." That's even broader, and I can find out whether breakfast was part of it, if I care about breakfast. Now, especially when we're not practiced at this, and even when we are, it's not uncommon to end up asking a closed-ended question. That's okay; you can follow it up with an open-ended question. In my simple breakfast example: "Hey, did you eat breakfast this morning?" and someone says no. "Oh, okay. What is your typical morning routine like?" Then you can open it up from there, or ask, "How does that compare to your normal daily routine?", and they can start talking about that. You can always follow up a closed-ended question with an open-ended question. All right, this is one of my favorites, and it's from Beyer and Holtzblatt, who talked about it in their book on contextual inquiry, Contextual Design, which is still a great foundational text for how to engage with people and really learn a lot about what they need. And that is to think like an apprentice. Usually when we think about research, we think about the scientist with their white coat and their clipboard, studying the lower life forms. We're doing the research, so we are obviously these superior beings.
We are studying these other people, these little organisms that we're gonna help out. How benevolent of us. We really want to invert that. They are the experts in what they do. We're not the experts; we're trying to learn from the people we want to serve. So we need to think like an apprentice: they're the ones with the information we need to get. We need to invert that hierarchy, think of ourselves in the lower position, and put the people we're talking to in the higher position. Along with that: what someone else says is true to them. It may not be objectively true, but it's true to them. I'll give you an example; I'm gonna go back to the jewelry company. When we were doing that research study, one of the issues we were aware of was that guys don't really have a lot of experience in how to correctly identify the ring size for the person they want to buy a ring for, especially if they want it to be a surprise. So one of the questions during the study, when they got to ring size, was: how do you know what her ring size is? And one person's response was, "Oh, that's easy. It's the same as her shoe size." If you don't know, there is no correlation between ring size and shoe size. Possibly there might have been that coincidence, but needless to say, the folks in the observation room thought that was hilarious. Which, again, is why it was really good that the observation room was a couple of conference rooms away: you really don't want the participant to hear a lot of laughter, because sometimes they will say funny things and think it might be about them. So: observation room farther away, good idea. Anyhow, while on the face of it this person said a kind of wacky thing, isn't that amusing, when you step back a bit you ask, well, what does that mean?
What that demonstrates is that we have a really long hill to climb to make sure people understand this particular concept and how to be effective. You can imagine: you've planned out this surprise proposal, you've got everything right, you've ordered the ring, and you go to put it on, and it's either too big and it falls off, or it's too small and you can't get it on. I mean, come on, that's probably not the moment people are envisioning. So some of these things are pretty important, and again, we wanna make sure we understand what we need to do. So: what they say is true to them. Take it and learn from it. When you're asking questions, you wanna avoid hypothetical questions; they're unreliable. Observing action is the best. Watch people do what they do; that's gonna be the most reliable thing. The next best is to have them recall something they've done in the past. They will leave things out, sometimes intentionally and sometimes unintentionally. Sometimes we have an image of ourselves that we wanna project, and our brain will filter so we maintain that image. That's why observation is best, but recall will still give us some important parts of the story. And lastly, there's "imagine what you would do," but if you're not in that scenario, or haven't been, it's hard to get reliable data. Better to ask: how have you solved this problem in the past? I always think of the introduction of 3D TVs, because they were really touted as this amazing technology. I can only imagine they must have had a lot of sessions where they brought people in, sat them down, and said, "Isn't this amazing?" And a lot of people went, "Oh my God, this is amazing." And what they probably didn't ask is, "Well, tell me about the last time you bought a TV." Because then they probably would have found out: well, a few years ago, HDTVs had just come out.
So I'd upgraded my standard-definition set to high definition. And how long did you have the old one? Well, I had that old TV for seven or eight years. All right, well, if you just got the last one three years ago, how likely are you to upgrade again? So you start to understand how someone has solved the problem in the past. And that's setting aside that 3D makes a lot of people dizzy. But anyway, how people have solved the problem in the past gives you a much better sense of how they might solve it in the future, and how your solution might fit into that. Next tip: follow the thread. You're gonna have your protocol, you're gonna have the questions you wanna ask, and you don't wanna just go boom, boom, boom through the questions. The goal is not to get to the end; the goal is to learn. So when someone says something that doesn't quite make sense, ask more questions about that. Those can be those unknown unknowns that lead to different insights. Sometimes you end up with an idiosyncratic participant who just has a different view of the world, or whose context is just different from everyone else's. That's fine; you can still learn from that. But sometimes someone says something that profoundly changes how you look at the problem, and that can lead to some really interesting solutions. This is why I like moderated testing versus unmoderated, just putting something out there and letting people go through it. Again, that's more validation: you only get the answers to the questions you ask. Same problem with surveys. When someone says something interesting, you can't follow up on it. So be curious. And that's why we have multiple participants: if we don't quite get to every question with every participant because we followed some interesting threads, you'll still get the answers you need, because you have enough people that you're talking to. All right, moving on. Echo, boomerang, and mumble.
These are three questioning techniques that can be really helpful. A conversation is kind of like playing catch: we throw the ball back and forth, and that's what keeps the conversation going. But as a research facilitator, I don't wanna add content to the conversation, because that has the potential to introduce bias, and I wanna eliminate my bias. I don't want any of my bias in the conversation. So these are three ways to continue the back and forth without adding content. Echo is basically saying the same thing back to them. Someone says, "So what I'm gonna do here is look at this data to see if this thing is true." And you say, "Oh, so what you'd be doing is looking at that data to see if that thing is true." And they go, "Yeah, yeah, yeah," and then they'll continue. They said something and paused, which indicates it's your turn. You took your turn in the conversation, but you didn't add anything. Now it's back to them, and they will continue their story. Boomerang is very similar, but it's for questions. You don't really wanna answer questions, because again, that's adding meaning. Someone asks, "What does that button do?" "Oh, well, this button is gonna do this thing and that thing." No, that's not what you wanna do. That's bias; you're telling them, and now you imply there's a right answer. Instead, you send the question back to them. "What would you expect that to do? What would you like it to do? What would be useful to you?" That's the boomerang: the question goes back, and you don't answer it. And the mumble technique is sort of inspired by this 70s TV detective, Kojak, who mumbled a lot. The thing with mumbling is you take advantage of the social phenomenon where people become uncomfortable with extended silences or pauses in a conversation.
So if you're not sure what to ask next, you can sort of mumble your way through it. "So what you're saying is, you need..." You don't even know what you're gonna ask, but you just slowly, maybe a little awkwardly, start to form your question. It's gonna be weird, and they're gonna fill in those gaps. Try it; it's brilliant. So: echo, repeat back what they said without adding. Boomerang, send a question back to someone. And mumble, to keep someone talking, especially if you're not sure what kind of follow-up question you wanna ask. Next: don't give people instructions. Your job is to learn, not to teach. This is really hard for product people, because product people have created the product, they know it really well, and they want to show someone. So if you see someone doing something that could be done better, there's this impetus to jump in and say, "Oh, let me show you a better way to do that." But again, now you're creating the impression that there's a right answer. So just ask questions to understand why they're doing what they're doing. Then at the end, you can say something like, "Hey, I noticed you were trying to accomplish this. There's another way in our product to do that. Would you be interested in me showing you?" And you can do it then. You've already learned, and then you can offer some suggestions. The "no answers" tip gets back to the boomerang: don't answer their questions; ask what they would expect. And the last one is: don't make excuses. I find this to be most common either with startups, or anytime you're testing a prototype that isn't completely built, or something that has bugs. People start going, "Oh, I'm really sorry, we meant to get to this, we haven't done that." If something doesn't work in the prototype you're testing, it's a great opportunity to ask questions. Someone goes to click on something.
And it doesn't do anything; maybe that wasn't in the flow you're trying to test. Instead of saying, "Oh, I'm sorry, we didn't build that yet," you can say, "Oh, what would you expect that to do? What would you expect to happen when you click that? How would that be valuable to you?" You can ask those questions, understand what the person needs, and then redirect them back to what you're trying to learn. Those are the main tips. The last one is not really a tip; it's just: record everything. Always inform people first. Even with a note-taker, you sometimes end up with notes you go back to and think, "I don't know what this means." So at least you have a recording you can refer to. I always tell people I'm going to record, and in 25 years, I've never had someone say no. I had one person in 25 years who asked me to delete their recording after two months, which I did; I agreed to that. By then we had already gotten everything we needed, so I was happy to delete the recording, but no one else has ever objected. I do clarify that it's only used for internal purposes and will never be made public. All right. So those are my tips. We've got about, what, 15 minutes left? So we wanna try this out a little. We're gonna do one round, and it's gonna be fast, so you're not gonna get this right. We're gonna break into groups; I think we'll do groups of four. How many folks do we have? We've got 12, so maybe three groups, whatever makes sense here. Pick one person as the facilitator, one person as the participant, and one person as the note-taker. Or in this case, "note-taker" in quotes: usually the note-taker would be taking notes on things the participant said, but since we're not really testing something, we don't really care what the participant's saying.
What I want you to do is observe how the facilitator is asking questions, so you can give them feedback. So you're gonna be taking notes on the facilitator. And if we're at groups of three or four, you can have multiple note-takers. Once we're in the breakouts, just bring up, I believe it's housing.com, which I understand is a housing site in India. As the participant, your job is to search for a place to live, because everyone's done that. If you're the facilitator, ask them a question or two about what they're looking for and what they would wanna do, then try to get them to do some initial searching through the site. Ask some questions, and try to keep in mind some of the things we talked about. We'll probably do about five minutes. If you wanna screenshot this, here's the list of tips, if you wanna have those as a reminder. So I'm gonna give you five minutes once we do the breakouts to have the conversation, and then we'll do a broadcast just to let you know that time is up. Then, for another three minutes or so, I want the people playing the note-taker role to give feedback on the facilitator. Then we'll bring everyone back, and if anyone has any other questions for me, or on user research in general, we'll take those. All right, any questions before we start, if you wanna unmute and ask one about what I just laid out? Anyone have any questions about either what I went over or user research in general? Hey Jeremy, yeah. Thank you so much for the session. It was quite interesting, and it was real. When I say real, I mean this isn't something I can read in a book; it's real-time. So I had a question in my mind: how do we master it? I'm sure you've collected this over a period of experience, maybe several decades, and it's working for you.
Now, for a layman like me who would like to really explore this, what are the tips you'd recommend to start with? Sure. It's actually easier than you'd think, but it does take practice, and there are lots of ways you can practice. At the professional level, if you're recording your sessions, you may want to listen to each session twice. The first time, you might be taking notes and asking, did we learn what we intended to learn from this? That's your goal from a business perspective. The second time you listen, you want to evaluate yourself as a facilitator. What did you do well? What would you facilitate differently? Like, "Oh, I asked that closed-ended question there and didn't follow it up. What open-ended question would I ask? Okay, I'd ask this if I did that again." Or, "Hmm, they said something interesting and I didn't follow up on it." That'll start to create those triggers so that you'll be better prepared for future conversations. The other thing is, frankly, you can also practice this every day, because the not-so-strange truth in life is that people will think you're a much better conversationalist the less talking you do. People love talking about themselves, so if you give someone the chance to talk about themselves, they'll love it. You can use all these techniques when you're just talking to a friend or family member and asking them about their day, or something they care about. Think about: how do I go deeper in this? How do I ask more interesting questions? How do I use these techniques to get them talking about this thing they're interested in? Because frankly, most of us, when we're having conversations, are only half listening; we're really thinking about what we're gonna say next.
So when you're just engaging with friends, family, colleagues, whoever, you can put that aside and think: I don't need to contribute to this conversation; how do I learn the most about this other person I'm engaged with? Also, hopefully at a conference, when you get the chance to meet other participants and learn about their professional experience, there are all kinds of opportunities to practice this.

Mm, thank you, thank you. It's only the silence, sometimes. As you said, the silence is uncomfortable.

Yeah, the silent one is a little bit harder from a social perspective, but you'll still find people will fill that gap. Again, we love talking about ourselves, so given the opportunity, we'll do it. Most of us, not everyone.

So Jeremy, nowadays, as you rightly noted, most of the time we assume that, yes, I know my user, I know my customer. And even when we do start interacting, as you rightly said, we're busy asking the questions and collecting surface-level information. But you said you want to hear what they actually want to express about the product. So to go deeper, is it only through opening up the questions and allowing them space to talk? Is that the technique?

Let me see if I understand: you're asking, is the only way to get that insight through open-ended questioning?

Yeah, to go deeper into the customer's expectations. How do we go deeper when they don't express it, when they don't tell us, when they're not even comfortable spending time with us?

Yes, so the spending time part is an interesting one, and there are different ways to handle it. In the longer version of this talk, I talk a little bit about recruiting participants. And sometimes you're going to pay people, right?
You're going to offer them an honorarium to spend time with you. Now, depending on how hard it is to get that type of person, you may have to pay more; for a more common profile, you'll pay less, because there are more of those people out there. But if you want, say, orthopedic surgeons, well, one, they're highly compensated, and there aren't a lot of them, so you're going to be paying more for their time.

That being said, I think you can often get away with not paying people at all by appealing to ego. If you approach people and say, look, we're building this product for people like you, and we think you would offer some really unique insights that would help us make this product better for you and people like you, you're doing a couple of things there. One, you're telling the person they're unique and special, which is true, and that they'll give you some great insights that might really help. Everyone likes to think of themselves that way: of course I'm very insightful, I have great things to share, and you're validating that. So yeah, I'd love to talk to you about that and share my brilliant insights. The second part is, if I use that product, I certainly want to make it better for me, right? If I give you feedback, it's more likely it's going to be better for me, so there's a bit of self-interest there. Sometimes you can appeal to that and not have to pay people at all, and in fact, I sometimes think you can get better feedback that way.

So there are different ways you can approach people, and you have to make it about them: why it benefits them to talk to you, not why it benefits you. Like, hi, I need to spend time with you because I need to get my job done, because my boss is on me and I need to produce this thing by this date. I don't care; that's your problem, not mine. But if you can make it about why I should care, then I'm more likely to say yes.

Thank you so much.
One thing I will share: if you want to get in touch or have any questions, you can reach me at J-E-R at m-s-m dot works. That's email. On Twitter, I'm at Sonarc, S-O-N-A-R-C. And I also have a podcast called Saving UX; that's at S-U-X dot live, S-U-X for Saving UX. So if you want to reach out or find more of what I'm up to, those are the three ways to do it.