and share my screen. Okay, got it. Well, good afternoon everyone. While Margaret is getting ready for her workshop, let me welcome all of you to the Reporting Qualitative Research Workshop, the session conducted by Margaret Roller. I am Sue Boffman from ARL, and I'm delighted to see you and glad that you're here with us this afternoon. As you know, many of you have attended some of our other workshops. We began this series of qualitative research workshops and quantitative research workshops at the beginning of 2021, and it's hard to believe that we have reached the end of our series. As you know, if you have attended these workshops, they are part of our Research Library Impact Framework initiative, and we've been delighted to have Margaret and Kevin Fommelot work with us in presenting these sessions. As is our practice, we are recording this session today, and we will share the recording and other documentation, like the slides, with everyone following today's session. I know we have a number of people who would like to have the materials and weren't able to join us, so we'll be sure to get them to those colleagues as well. And again, as we practice with all of our sessions, if you have colleagues who weren't able to be with us, or colleagues in your library who might be interested in this information, please do share the materials. We're happy to have you do that. So with that, Margaret, I'm going to turn things over to you. Thank you for being here, all of you. Thank you, Margaret, for conducting our workshop today. Okay, great. It's good to be here, and thank you all for being here. I'm looking forward to this workshop, the final workshop in our series, as Sue said. This one, as you know, is on reporting qualitative research. Earlier, we had workshops on the in-depth interview method, on the focus group method, and on analysis, and now on reporting. Let me, hang on. Here we go. Here's an overview.
And let me say, for any of you who were in earlier workshops of mine, you're going to go, oh my gosh, she's going to talk about what qualitative research is again. And yes, I am. That's because I start every session I do anywhere on qualitative research by talking about what qualitative research is, because it's so important to understanding whatever it is we're going to talk about, in this case, reporting. So I am going to start there again, and it's going to look familiar. I'm going to briefly talk about analysis, because that leads in very nicely to talking about reporting, the overall goals, and the components or sections of a report. Then most of our discussion will be about the components or sections of the report, using examples from my own work. So, what is qualitative research? Again, for any of you who were in other workshops of mine, you have seen this slide before. But I think it's important, I know it's important, to remember what qualitative research is. It's not quantitative research. We are going beyond the obvious and the expedient. Everything is about context and interconnections: how one question is related to another question, and all the interrelationships in what our participants are talking to us about in our objectives. For what it's worth, I started out as a quantitative researcher. But when my head was full of survey research, I very quickly wanted to become a researcher, and I knew that to be a researcher I needed to understand this thing called qualitative research. That's when I set out on my path of exploring qualitative research and just became passionate about it. But in doing so, what I also became passionate about is the idea that to be a qualitative researcher, you need to embrace it on its own terms. Not in quantitative terms or survey research terms, but accepting that it is what it is.
And it is what you see there: these ten unique attributes, which, again, if you've been in other workshops, you've seen before, and you've seen me highlight what you see on the right there, the importance of context and meaning. I'm showing it to you again, and having this discussion with you again, because these are central ideas that play a central role in how we report qualitative research. So as we go about the process of reporting qualitative research, I'm encouraging you to keep these central unique attributes of qualitative research in mind, because you will need them. Analysis. In the analysis workshop, we talked, of course, about what? About the same kinds of ideas: the underlying meaning of our data, what I call the contextual meaning of the words, not just the words themselves. What does it really mean when someone talks about library support or service or impact, and how can that have a different meaning among different types of participants, in different types of situations, and for different types of objectives? In a nutshell, and put more simply, the analysis of qualitative data is a nonlinear process that is focused on the latent, not just the manifest, content and is holistic. Needless to say, because we're not dealing with discrete bits of data, our analysis doesn't follow a straight line from point A to point B. These are some of the points I was talking about in the analysis workshop, and I bring them up now because they lead very nicely into talking about the reporting of qualitative research.
As I stated in the analysis workshop, and as I highlight here, the goal in our analysis is to construct a narrative from which themes and patterns can be identified, and from which we can draw interpretations. In the analysis workshop, I talked about the eight basic steps that lead to identifying categories, or what I call buckets, as well as themes, and then drawing interpretations from the data. Now, it's these categories and themes that you've identified in your analysis that are going to be used to communicate your narrative as it relates to the research objectives. So now we've entered the realm of reporting qualitative research and its overall goals. The overall goal, just as we talked about in the analysis workshop, is to communicate a narrative. And importantly, and I think it's true in survey research too, not to report everything you heard. I'll talk about this again, but the idea is that in your analysis, and now in your reporting, you're not there to regurgitate everything you've heard in your qualitative research, but to draw on the categories and themes that you derived in your analysis, and to convey a really rich understanding based on that analysis, one that conveys human experiences, attitudes, and behaviors as they relate to your research objectives and questions. Needless to say, your reporting should be simple, user friendly, and convincing. So those are some basic goals. Now, the basic components or sections of a qualitative research report. Let me say from the get-go that these can and should be thought of with some flexibility. Let me say a couple of things here about what I'm going to be talking about today.
When you look at these sections, know that what I have in mind is a text document, text reporting, that's going to be used internally with your stakeholders. I'm not talking about writing a journal article, and I'm not talking about writing a PowerPoint presentation of your research. Now, having said that, you could use any aspect of what I'm going to talk about today for a PowerPoint presentation or, obviously, for a journal article. But where I'm coming from today is the context of an internal document for your stakeholders. So these are the basic components of your report. This is fairly flexible, because it will depend on your audience and what you need to convey and not convey, and we'll talk about that. For instance, I'm going to talk in a minute about the executive summary, and I'll mention that I don't always use an executive summary; you really need to think about your audience before you use one. Going down the list there, the implications and recommendations section, which I'll talk about when I get to that part, may really just be a summary. Sometimes I call it opportunities; sometimes I call it something else. So this is fairly flexible, again depending on your audience and your particular situation. Now, the summary. I think it's very straightforward, and I think you folks already know this. My summaries are typically one to five pages: a paragraph on background and objectives; something pretty short on data collection, unfortunately, and that's one reason I sometimes don't use executive summaries; two or more paragraphs on findings, depending on objectives, size of study, your audience, that kind of thing; and then probably a paragraph that sums it all up, or what I've called the implications and recommendations section.
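For anyone who produces these reports repeatedly, the section list just described can be kept as a small reusable template. The sketch below is an editorial illustration only, not something from the workshop; the section names follow the list above, and the helper name is hypothetical.

```python
# Illustrative sketch: the report sections described above as a reusable
# outline. The "Executive Summary" entry is optional, mirroring the advice
# that you should judge your audience before including one.
REPORT_SECTIONS = [
    "Executive Summary",              # optional; depends on your audience
    "Background & Objectives",
    "Research Design",
    "Preface",                        # cautionary note for survey-minded readers
    "Findings",
    "Implications & Recommendations", # sometimes titled "Opportunities"
    "Appendix (discussion guide)",
]

def report_outline(include_summary=True):
    """Return the ordered section list, optionally dropping the summary."""
    return [s for s in REPORT_SECTIONS
            if include_summary or not s.startswith("Executive")]
```

Dropping the summary (as the speaker sometimes does, to make stakeholders read the full report) is then a one-argument change: `report_outline(include_summary=False)`.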
You just really have to judge your audience as to whether or not you're going to have an executive summary. Part of what I mean here is that when I have used an executive summary, I've been saddened to learn, it's not surprising, I know, but a little bit sad, that some of my clients or stakeholders or end users would read the executive summary, and that's all they would read. As a research person, that would get me very, very concerned. And I found that if I could eliminate the summary, or do it in some other way, they would be forced to read the report, and then they actually understood all of the nuances of the research and had a true appreciation of what it is we did, and of how we got to the implications and recommendations that we did. So, background and objectives. I've just got one slide here. This is one example from a study I did for Michigan State: asynchronous online groups with faculty concerning outreach and engagement. It's pretty straightforward. I typically have just two, maybe three paragraphs, with the first paragraph being the background: why are we here, and what are we doing? The second paragraph then talks about what the primary objectives are, that kind of thing. Notice that at the end of that second paragraph, I've also stated that an objective was to understand participants' recommendations for new approaches to outreach and engagement at MSU. In other words, I'm always including in this section of the report something actionable that we hope is going to come out of this research. I think that's very, very important. I'm going to now start talking about research design, the next section of the report. Before I do, I'm just going to stop here for a second and ask if there are any questions. Okay, research design.
Before I get into the intricacies of research design, let me just make this statement up front, which has to do, as you see, with transparency: the idea of providing sufficient details so that the users of the research and the readers of your report have enough information to determine for themselves whether or how the study parameters compare to, or can be applied to, similar contexts. This might be similar participants, similar places or locations, similar times, that kind of thing. The user could be looking at how the study is the same as or different from earlier research with the same population group, with the same segment of the population, and how the results are the same or different. They might be looking at how the design itself might be used with other target segments of the population. So, again, the user or reader of your research needs enough information to be able to make those judgments for themselves. This is what I refer to as transferability, which is the term we use in qualitative research to talk about the ability to transfer what's gone on in one study to the context of another study. Now, once again, you have to judge for yourself how much transparency you can build into your report. Transparency is a sign of quality; it's a quality approach to your reporting to be transparent. But the practical bottom line is that you have to judge your audience, you have to judge how much time and resources you have to provide that kind of information, and how much of that information you can provide. And that's going to be different for each of you. If you're doing journal writing, of course, that's going to be a different animal than if you're doing internal reporting for your stakeholders.
So again, you just have to judge your audience and how transparent you can be. I'm just encouraging you to be as transparent as possible. Okay, research design at the very most basic level: you're going to report on the what, how, when, and who. Notice in the how, and again this is the MSU study with the two groups of faculty, you're going to talk about how you did the recruiting and who you recruited. We had two groups: one was designated the high engagement group, faculty members who were highly engaged in outreach and engagement, and the other was what we called the low engagement group. Be sure to include when you did this research, and provide dates, and who you did it with in terms of number of participants. And not only the number of participants; I encourage you also to include who, in this case it was me, developed the guide, designed the research, actually conducted the groups, and did the analysis and the reporting, that kind of thing. Also notice that the very last thing you can see on this slide is that the discussion guide is in the appendix for reference. So that should also be in there. Indeed, whatever guide you've used, whether a moderator guide, interview guide, or observation guide, it should be in the appendix to your report. Now, reporting participants. When I have participants running across multiple segments, which has happened, and these are two examples from my work, and I know that it is important to my client or the sponsor of the research to see the breakdown of participants by these multiple segments, I will put something like this in my report. The top table, or matrix, you see there is for an in-depth interview study in which I did 30 interviews.
When I recruited and conducted this research, I very purposely made sure I had a good mix of decision makers across different types of senior housing and healthcare communities, in terms of type, as you can see going across in the columns, as well as the number of communities that these organizations had under their umbrella, so to speak. That was important to the client, to the sponsor, and it was important to me when I actually conducted the research to make sure I had that mix. Therefore, it was very important that I included this in the report, which I did, so that they could see how the 30 interviews broke out. The bottom part of that screen is a focus group study I did with consumers. You can see, again, I had two cities, and in each city I conducted two groups, and they were very distinct groups in how and why they were recruited. Again, it was important for the client to see that we had this kind of mix. So that's how I did it. Now, what I've shown you thus far about this aspect of research design is pretty bare bones, and I get that. Depending on your audience, if you have the opportunity to elaborate, let's say, on your data collection, or on where you conducted the research, or on how you developed the guide the way you did, or on other aspects of your participants, maybe going beyond demographics, for instance, I highly encourage you to do so. And I would include analysis in that, because there are a lot of users and readers of our qualitative research reports, and I'm not talking about journal articles now, I'm talking about internal stakeholders.
There are a bunch of them out there who, I don't know why, are not really keen on reading a lot about analysis. But to the extent that you can get away with it, I encourage you to include something about analysis. For instance, include something about your data format: did you have audio recordings or video recordings or transcripts? If you had transcripts, who did the transcriptions, and did you have rules for the transcriptionist? What about the overall process and the steps that you went through, and maybe even any steps that you may have skipped, perhaps for some very good reasons? Maybe something about the coders, and how you checked for accuracy. Did you do any verification? And if you can, and again, this is not a journal article I'm talking about, also include some reflections on the process, on the analysis. For instance, if you did have to skip any steps in the analysis process, do you think that made any difference or had any impact on the analysis and on its outcomes? Anything that you can add about any of these areas having to do with analysis is a good thing. But again, you have to judge your audience, and you have to judge your resources, and that kind of thing. Do I have a question? Oh, okay, no problem. Thank you for coming, Charlotte. Oops. Okay, so let's talk about the preface, which is a kind of cautionary statement. I'm just going to give you a couple of examples, and again, this will partly depend on your audience. The users and readers of my research are often people who are very comfortable and well versed in survey research, but not so much qualitative research.
Many times in my career, I have included some kind of statement up front, before I get to research findings, so that they understand what it is they're going to be looking at and what we're going to be talking about when we get to the findings. This one is for a consumer study I did, focus groups with consumers, and I'm simply emphasizing the difference between qualitative and quantitative research, while also emphasizing the contribution that qualitative research makes to survey research. The next example is from a GuideStar study, and you're going to hear me use examples from this study throughout our session today. The GuideStar study is unique in that I conducted 86 interviews. That's a lot of interviews. What's important here is that when I got to writing the preface, I thought, oh my gosh, I can already think of some people who are going to read this report, see that I conducted 86 interviews, and immediately start thinking of this as survey research. So I wanted and needed, in my mind, to do something to head that off at the pass. I really wanted to make sure that they understood what qualitative research is, and what I tried to do is emphasize the unique attributes of qualitative research. Okay, let's talk about creating a narrative. This is the most important part of what we're going to be talking about today. What you see here is the funnel approach to guide development, and if you were in either the in-depth interview workshop or the focus group method workshop, you have seen this funnel before. And why am I bringing this up again?
I'm bringing this up again because, once again, I'm encouraging you to think about why we even conducted the interview or the discussion the way we did: going broad to narrow, gaining context and background in the first few stages of our guide, so that when we got to stage four, we could use that information to help us understand where the participants are coming from, their lived experiences as they relate to the research objectives, based within the context of their lives, which we've gained from the earlier stages. I think it's critical to keep that in mind as you are building your narrative in your reporting. So what have I done here? I've flipped the funnel on its head, and now we're starting from the ground up. We're starting from the ground up by giving the reader the kind of contextual, foundational understanding that you learned in your interviews or your focus groups in the first few stages, and using that as the basis of your narrative as it unfolds. Because from there, you can discuss what you learned related to the relevant concepts and constructs within that basic foundational understanding that is holding this whole thing up. And from there, you can get to where you really want to go, which is to talk about the themes you derived from your analysis that are relevant to the research objectives. It's a kind of logical, interpretational narrative that you're giving your reader. So now the themes that you're discussing around the research objectives are being discussed within the foundational understanding and interpretations of the relevant concepts and constructs. I'm going to give you a few examples from my own work. One is going to be research I did for EPA, the Environmental Protection Agency.
One is for an energy client, a provider of electricity, and then GuideStar again. So let's start with the EPA. Here I conducted focus group discussions with EPA staff and with behavioral and social scientists, because what EPA wanted to know was what behavioral and social science research was out there that would be compatible with EPA priorities. And here's where I started in the narrative to set the groundwork, going from the ground up. I started with a discussion of the fact that staff and social scientists share the same attitudes. So let's just start there: they share the same attitudes. And one of the attitudes they share is that behavioral and social science research impacts all areas of environmental policy, not just EPA priorities. In other words, they wouldn't be confined just to EPA priorities. From there, I had a discussion about the idea that behavioral and social science research is thought about on a more profound, more comprehensive scale, that these researchers were talking about, and thinking about, more comprehensive concepts, such as infrastructure and that kind of thing. And from there, that set the stage for understanding their recommendations for EPA priorities on specific topics. So when we got to the top of this pyramid, and they talked about more comprehensive ideas, such as, how do we engage the public, and why isn't there compliance, we had a very good grounding and understanding of where that came from. Another example is the energy company I mentioned, a provider of electricity. Here there was a whole series of focus groups, a bunch of them.
And the whole purpose was to understand customers' reactions to a green tariff concept, which is basically the idea of charging them a monthly fee that would go to renewable energy. I started my narrative with a discussion of customers' interest in the environment, period. And what I learned is that it was fairly lukewarm. There was some interest, they did some recycling, but it was pretty lukewarm. They did notice that some companies were better than others in helping the environment, but they hadn't noticed any energy companies that were helping the environment. When we got to talking specifically about their energy use, and specifically about their use of electricity, there was no awareness. And in the narrative, that started to make sense, because I had just said they're kind of lukewarm in their interest in the environment, and now we're talking about their electricity use: I'm not aware of it, but I do know how much I pay. They were very much aware of the bill. When I asked very specifically about the terminology used in the green tariff concept, they couldn't define the terms for me. They weren't aware of them; they didn't know what I was talking about. Again, it was all starting to make sense as part of the narrative. So when we got to actually getting reactions to the concept itself, it was really no surprise that they couldn't see a benefit to the concept, and their only concern was that it was going to cost them money. Now, GuideStar is a different animal, and I understand that; that's one reason I throw it out there, because it is a different animal with its 86 interviews. It was a lot of interviews.
What I was doing here was talking to private foundations, public charities, and corporate giving people, and the main objective was to gain their reactions to concepts, to products and services, that GuideStar was thinking about investigating. One of the things I learned right away in this research is that it was not useful to think about these 86 people, or the 86 organizations, that I interviewed in terms of private foundation, public charity, corporate giving, and the other groups that GuideStar had put them in. That didn't tell us very much, because it didn't tell us how they were actually using GuideStar. By the way, GuideStar is this massive provider of online information on nonprofit organizations, drawing primarily on 990 tax information. What was useful was thinking about how they were actually using GuideStar: what kinds of information they were using from GuideStar, and how they were using it. So one of the first things I did was create user segments, three user segments based on how they were using GuideStar information. That's where I started my narrative. My narrative started with: here are the three need-based user segments that I identified, and then I defined them and described them in detail. From there, I could talk about how each one of those user segments finds advantages in using GuideStar as well as other information providers. And from there, we could talk about, which we did in the report, the user segments' reactions to these various products and services I asked them to react to. By the time we got to those reactions, it all made sense, because the readers understood these user segments and what the segments considered the advantages of this information.
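The segmentation step described here, grouping interviewees by how they use the information rather than by organization type, can be pictured as a simple classification pass over the interview records. This sketch is purely illustrative: the segment names, rules, and participant data are invented for the example and are not GuideStar's actual segments.

```python
# Illustrative sketch of need-based user segmentation: each participant is
# assigned a segment from their reported uses of the information, not from
# their organization type. Segment labels and rules are hypothetical.
def assign_segment(uses):
    """Map a participant's reported uses to a hypothetical segment label."""
    if "due diligence" in uses:
        return "Verifiers"
    if "prospect research" in uses:
        return "Prospectors"
    return "Browsers"

# Made-up interview records keyed by participant ID.
participants = {
    "P01": ["due diligence", "990 lookups"],
    "P02": ["prospect research"],
    "P03": ["casual lookups"],
}
segments = {pid: assign_segment(uses) for pid, uses in participants.items()}
```

The point of the sketch is the design choice, not the code: once every interview carries a segment label derived from behavior, the narrative can be organized segment by segment, as the report described above was.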
I'm going to just stop here again and ask if there are any questions, because I'm going to move from here to things that we can do to help tell the narrative. Okay, all right. Hang on. Okay, good. Okay, thanks, Nancy. So here I'm just going to throw out a few ideas, and by the way, at the end I'm going to ask for your ideas, things that you folks have done that you have found have worked well. Here are mine, things that I have done that I have found work very well for the sponsors and users of my research. One is headlines, meaning that the headline is actually explaining what it is I'm going to be talking about and is helping to tell the narrative. In fact, if you just went through the report reading the headlines, you could pretty much understand the narrative I'm getting at throughout the whole report. The example on the top of the screen is again the EPA study, which I've already mentioned to you. I had one headline that says, as you can see, more similarities than differences, and it just talks about how the EPA staff and the DC meeting, which refers to the behavioral and social scientists I talked to in DC, kind of think alike. Then I had another example of a headline about how their priorities transcend, go way beyond, what we wanted to talk about in these groups as far as EPA priorities. EPA had their own little set of priorities, but as I mentioned earlier, the participants I talked to said, no, we need to go way beyond that, and they were all in agreement on that. The bottom part of your screen is the energy study I just mentioned. Again, I had one headline that just said, as you can see, energy companies could do more, and another that said customers' electricity consumption doesn't really register.
Notice, too, how in that very bottom example, doesn't really register, I used a quote. Indeed, using quotes in headlines can be very effective. You have to think long and hard about what you're going to use, but it can be very effective. The example at the top of the screen is again from GuideStar, and the bottom of the screen is an example from a consumer study I did. Again, you have to be very thoughtful about what it is you're going to use. Which leads me nicely into this next slide, which says: yes, use quotes, but less is more. Using quotes can be very effective, and I'm a big, big fan of using quotes to move your reader closer to your narrative and to the participants' actual experience. That is what's really great about quotes. However, less is more. It may take more time, and it will take more time, I'm telling you now, to decide which quote you want to use and to set aside what you don't want to use. But that is much preferred to giving your readers a long list of quotes. It is not the job of the users and readers of your research to sift through all your quotes and figure out what they should be paying more attention to. That is your job. Part of the job of being the researcher on the study, and doing the analysis and the reporting, is to figure out which quote or quotes say what you need to say, and then leave the rest behind. The top example in this slide, again, is from the GuideStar study. I picked two quotes that I thought really captured what I was trying to say. The bottom part of the slide is the MSU study, the outreach and engagement study I mentioned to you earlier.
And what I did here is give quote examples; I was talking about the land-grant mission of MSU. If you remember, I did two group discussions with faculty: one was the high-engagement group and one was the low-engagement group. And what I did is pick one quote from the high-engagement group and one quote from the low-engagement group. You'll notice, too, at the bottom of the screen for this MSU study, I used bold text. And indeed, bold text can be a very effective way of offsetting and highlighting important takeaways from your research, as well as supporting the headline. So it really helps that you've got your headline and then you've got this bold text, and they complement and reinforce each other in helping to tell your narrative. Let's talk about language for a second. I want to discourage you folks from stating something like, and I made this up, this has not come from anybody I know except my head, "six out of nine faculty participants favor an open access publication model," or "five out of 10 staff participants prefer working remotely." I really discourage you from using that kind of sentence structure and that kind of talk in your reporting, for the reasons that I give here. It really flies in the face of what it means to conduct and analyze qualitative research. It goes back to my early slide that I use over and over again, which is: what is qualitative research? Go back there, go back to the unique attributes of qualitative research, and you realize that this really contradicts why we conducted qualitative research in the first place.
It also, as I state here, encourages your readers to think quantitatively, and what I worry about is not just that they think quantitatively about that particular statement in the report, but that maybe it flavors how they think about the entire report and the entire piece of research you did. So I worry about that. And in a similar way, it encourages your readers to do the math. So, oh, okay, six out of nine faculty participants. Let me see, six out of nine, that's two-thirds. Okay, so two-thirds of the faculty participants favor an open access publication model. There are other things that you can do in terms of language. Here are just words that I put in alphabetical order; the order doesn't mean anything. These are words that I often use in my reports, as you can see. All I'm trying to convey here is that there are non-numerical ways to convey a sense of degree when you want to convey a sense of degree of something. Okay. You can also do it with visualization. And here's an example from the GuideStar study again. Remember, I was asking participants to give me reactions to a whole bunch of products and services. And I thought, well, how am I going to convey that so they'll just get it? I don't want any numbers or anything like that. I just want to get across the essence of what I learned about the importance, the popularity, of these products and services. And so I came up with this very simple way of doing it. And I know from their reaction that it worked, that they could immediately tell what was most popular and what was least popular. More examples of visualization. Here's another one, from an energy study. This has nothing to do with the Green Tariff concept I mentioned a minute ago. This had to do with attitudes towards the monthly bill.
And what I wanted to convey here, again, I was wondering, how am I going to do this? And this is what I came up with. Here was my dilemma. In my analysis, and I'm sure this is true of you, it's true of me every single time, at some point the light bulb goes on. And the light bulb went on, and I understood that what I had heard in my group discussions with customers of this energy client really revolved, in some shape or form, around this idea that electrical service is a cost-of-living expense, just like rent and food and health care and those kinds of things. That is how my participants were thinking about their use of electricity. Now, once I understood that, everything else seemed to fall into place for me. Then I understood how that attitude, that belief, was influencing their attitudes towards their service and their utility bills and their usage. So it was really fundamental to everything else. So again, I was thinking, well, how can I convey that without a bunch of text? And this is how I did it. And again, I got the kind of feedback that says to me it did the trick. Here's another example of visualization. For a bank client, I was talking to CFOs at private and public universities and colleges and wanted to convey, as you can see, board involvement with financial decisions. It didn't need to be rocket science, and it didn't require a lot of text. I simply needed to convey where they fell in their involvement in financial decisions, and that's how I depicted it. Now, this is also for a bank client, but this is for senior housing and healthcare. What you're looking at here was extremely effective for its purpose, which was to communicate the theme of relationships. This was an in-depth interview study. I did a bunch of these interviews, and one of the themes I came up with had to do with relationships.
Well, for goodness' sake, if I went back to the client and said, well, guess what, relationships are really important, they'd say, yeah, we know that. So what I did is what you see there. What I'm showing them here is the theme, relationships, and then the four categories or buckets that make up relationships. So what is relationships? Relationships is ease of process, relationships is resolving problems, relationships is flexibility, relationships is responsiveness. And then, okay, well, I need to convey, you know, what does that mean? What does flexibility mean? So for each one, I found a quote from a participant that I felt really conveyed what is meant by that particular category as it fits under the theme of relationships. And so that's what I put into my report. And I know for a fact that this provided actionable information for the client. They just got it. And I'm happy they did, because I was really, once again, wondering how I was going to convey this in a way that would be understandable and effective in creating some action on their part. And it worked. Okay, I'm going to end today by showing you just a few matrices. Now, matrices, I know they're not too sexy, but they become very important, as I mentioned earlier with respect to reporting, when you have just a lot of information that you really do need to have in that report, because it is key to the objective, and the client, your sponsor, is expecting it. In this case, for an energy client, we were looking at several, three or four, different designs of a newsletter with customers, getting customers' reactions.
Now, with three or four different designs, and then asking them, you know, what do you like, what don't you like, how could this design be improved, as you can imagine there was just a lot of information there that could have been very confusing. But to help tell the narrative, and to stay on objective, I created a matrix for each one of the newsletter designs. In my analysis, to help tell the narrative about reactions to design, I came up with three themes that really governed how people were reacting to these designs. The three themes seemed to revolve around simplicity, in other words, how simple the design looked; the degree to which it grabbed their attention; and the degree to which the information was relevant to them. So using those three themes, as you can see in my matrix, I then said, okay, here's what they like about this particular design, this is design Y, across these three things. This is what they don't like so much. And then this is what they are suggesting for improvements. Here's the GuideStar study again. And here, as I've mentioned before, one of the important objectives of the study was gaining participants' reactions to a whole bunch of products and services that GuideStar might provide in the future. And what was important to me, as the person who came up with, if you recall, the three user segments for GuideStar based on this research, was that the users of this research understood reactions to each one of these ideas by user segment, because the reactions weren't the same for every user segment. In the example you see here on the screen, this is actually not a great example in that all the segments preferred the ideas you see on the left.
But I was able to say, for the very bottom one, the annual report, that all segments preferred the annual report, but it was especially preferable to what I call the prospect and CS segments, two of the segments. But then I also needed to convey why it was or wasn't perceived as useful. That's a lot of information. And importantly, and this is very important, it's a lot of information that was key to satisfying the objective of the study, so it had to be in there. Another matrix for GuideStar, and this one was simply about existing features of their website. I was asking participants to give me suggestions for how they could improve existing features such as their interface. Again, another example, but again a key objective of the research; it had to be in the report. EPA study again. And here's just another example of a large amount of information. Here I'm listing questions and issues associated with each of the EPA priorities, in this case individual differences, behavior, and attitude change. Again, this was key. This is what EPA expected me to come back with, so here it is. The last section before the appendix of the report is what I'm calling implications and recommendations. Now, as I've already alluded to, first of all, it may not be implications and recommendations; it may be opportunities, it may be a summary, it may be something else. It may just be recommendations that you haven't discussed anywhere else. But it's important to keep in mind that in qualitative research there is no clear divide between everything you've talked about in your research findings and this last section of the report, because believe me, you've probably already talked about the implications and maybe the recommendations in the findings, which is the body of the report.
So you shouldn't go into this section, in other words, thinking that there is this clear separation and divide, because indeed you may have, and you probably have, already talked about many of the implications as you've gone through the research findings. Here's just an example of something I did for an architectural design firm. Again, this is the Green Tariff study, and I'm simply highlighting the key areas that they need to think about in developing the Green Tariff program. And here, again, is EPA. In this case it's not the EPA priorities that you're looking at; rather, the scientists, the behavioral and social scientists, came up with their own priorities, and that is what I talk about in the last section of the report. Okay. Here are some resources. How am I doing on time? Okay. Here are a couple of resources. On the top is my blog, Research Design Review. I've put a link there with the tag "reporting," and that will simply take you to all my articles. I've done a number of articles about qualitative reporting on my blog, including, you'll see if you go there, a compilation I did a year or so ago of articles related to transparency and reporting that you might want to take a look at. On the left is our book. And the reason I put it there is because we talk about reporting in the book, and we talk about transparency. So if you wanted to learn more about that, that might be a good source. I also put the APA Publication Manual, Seventh Edition, which we qualitative researchers are so excited about because, for the first time, it has a section on reporting standards for qualitative research. Now, of course, they're talking about journal articles, but it's well worth a read. I highly recommend it. Here's my contact information. I also have my office hours link there.
If any of you would like to book some office hours, I'd be happy to talk to you about any of this, or any of the prior workshops we did, or whatever it is you're working on. Speaking of which, do we have questions? And I'd also love to hear about any of your experiences in reporting qualitative research, anything that you've done in your reporting that you think has worked particularly well, or maybe not so well, I don't know. Hi, Margaret. This has been really, really useful, and I appreciate it. I was going to ask you about quotes, but you covered that. And I loved what you said about don't make the reader sift through your quotes, that that's your job. I hope you're okay with that, Nancy. I thought of you when I said that. It was good. They were wise words. And, you know, one of the things that we struggle with is that those quotes are so rich, and you become very attached to them. Yeah. And it's hard, it really is, and it can be hard to make them short. And I know this is just the writing. You make it seem very straightforward and clear, but it's really hard. It's hard. It's really tough. And it's kind of consuming. Yeah. Yeah, and maybe when you're doing it with a group, it's even harder because you have a lot of negotiation. Right. Yes. So one of the things that I have learned is to set that out in the beginning: okay, I'm going to write the first draft of the report as team lead, and then you all can weigh in and make sure that it makes sense and is correct from your perspective. But you can't be doing group writing or taking forever. Yes. Yes. I totally agree with that. And yes, it absolutely is time-consuming. And here I'm going to say, oh, you just have to block the time, and believe me, I understand how hard that is. Yeah.
Any other questions, or anything you'd like to share? Margaret, I think you may have answered everyone's questions in your presentation; I'm looking at two screens here. We'll just give everyone another minute in case there are some questions for you. Sure. You make it sound easy, like Nancy said. Yeah. Well, that's why there are office hours. Your years of experience and practice will come in handy. And just to pick up on that, I do want to encourage colleagues. I know several colleagues on this call have completed their project or practice brief and submitted the report, and it's actually been published. And others are beginning that writing process. I know we had some colleagues here on Tuesday for this workshop who are beginning their writing process, and their feedback was that this has been very helpful. Great. Good. Any other last questions or comments for Margaret before we say goodbye? Well, with that, Margaret, thank you very much for offering this workshop and all of the workshops in the series that you did over the past many months. We really appreciate it, and we appreciate all of our colleagues for attending the workshops and being a part of these great discussions. Thank you very much. Thank you, everyone. Okay. Look forward to seeing you at future meetings for our teams. And stay tuned; we'll be in touch with plans for an end-of-initiative celebration. So thank you all for coming. Thank you so much. Thanks, everyone. Bye. Bye.