If any of you have ever taken a writing class in your lives, you will have heard this phrase: show, don't tell, right? It's the number one thing that creative writing teachers especially say to their students, the comment you'll see all the time on your assignments or your essays: show, don't tell. I don't want you to tell me about what the character does, I want you to narratively show it, right? This is relevant to me in particular, obviously, because we're talking about data and storytelling, but my PhD is in literature; it's in storytelling. It's in understanding how a story works and how you can convey all sorts of meaning through story. Data is just one of the tools that we can use to tell really good stories. So today we're gonna talk about how, much like any other kind of story, data can be used to reveal a tremendous amount about your work, your impact, your community, and how some of the same principles of storytelling that apply when we're using text and words also apply when we're using data. So, yeah, let's jump into it. Just to second Eli's land acknowledgement: I'm based in Toronto, a first-generation settler of Irish descent. The place I'm in, known as Toronto, is covered by Treaty 13; it's the territory of many nations, including the Mississaugas of the Credit, the Anishinaabe, the Chippewa, the Haudenosaunee and the Wendat. It's also now home to many diverse Indigenous peoples. And as a person whose home, Ireland, is still under colonial rule, I deeply support the rights of all colonized people to assert their autonomy in the face of settler states. This is a beautiful image of beadwork by a collaborative of Indigenous artists; I just think it's a stunning piece of art to represent how many different nations are a part of this country. So yeah, just briefly about me: as Eli mentioned, I'm an accidental techie, which is to say that my training is not in technology or data science.
I came into this work because I had really great mentors along the way, and because I was just curious about how things could be done as effectively and as efficiently as possible. And it turns out technology is a really big part of that. As Eli mentioned, I've played every conceivable role in a nonprofit, and now I'm working, I would say, nonprofit adjacent. I work as a consultant with small nonprofits, organizations usually less than a million dollars in size, those folks who don't necessarily have an IT department. I become a little bit of their IT strategy provider, and we work on getting them really technologically enabled. So I've been telling stories about data for a long time. This is just a bunch of various videos about my work with nonprofits to use data and technology better in their work. You can check them out at onyamaclin.com. It's also a fun tour through my hairstyles over the last decade, which is always fun. So what's on tap today? First, we're gonna talk about what kinds of stories you can tell with good, organized data. We're going to talk about why audience matters the most when you're thinking about storytelling, particularly when you're using data. How do you apply structure to your data to tell stories? A lot of you have, I would say, for the most part unstructured data, okay? And in order to really make sense of it, we have to apply structure to it. So how do we go about doing that? How do we determine what kind of structure to apply? And then some practical steps that you can take today to surface stories in your data, some really concrete things that you can do right now after this session, if you're feeling motivated, to start to unpack the data that you do have and see what potential there is to tell stories, and of course also to see what kind of data you might be missing and want to start collecting going forward.
As I go along, please feel free to pop any questions into the chat, especially questions clarifying something that I've just described. Eli will let me know, and I'll be happy to provide any insight or additional definition to help clarify. If there are questions of a more substantive type about the content, then we'll certainly leave some time for them at the end. So where we're trying to get to, in some sense, is data that can be accessed by the end user, so that they can basically use a self-serve approach to answer the questions they might have about your data. It's really best served and shared in dashboards that can be manipulated by that end user. For example, I was working with a client who had a fundraising board. This board was really engaged in fundraising, and they wanted to see the evidence of their efforts. They would be writing emails to the executive director saying, okay, where are we at? Have we gotten donations? Any new grants from that email that I wrote you? Where are we? And it can be quite challenging for an executive director or a director of development to have to manage those information requests. So instead we thought, what if we make a dashboard that reflects that donation and pipeline data in real time, that the board members can just access on an intranet and run their own analysis, because each of them has their own questions about the effectiveness of their particular effort on that campaign. So I'm just gonna open this up really quickly and show you what I mean by self-serve. This is just an example, again, of how when we think about who the audience is, in this case our board members, and we think about what kinds of questions they might have, what kind of stories they need to tell about their work, then we can come up with some solutions that help them answer those questions without us necessarily having to be hands-on, okay?
So this is obviously a little bit of an exercise in user-centered design and then, obviously, an exercise in data. It's just really nice that if I'm a board member, let's say I'm Eliana, and I wanna see, okay, what's been the impact of my effort, right? Now the charts all change so that I can see, okay, there are 38 donors that gave a gift as a result of my efforts. These were the sources. This is the average donation year over year. Let's say I just wanna look at 2019 gifts; I can do so, and I can see the value of the gifts that have been made as a result of my efforts. So this is just an example of one of those kinds of client dashboards that we put together, and that really helps the end user answer their questions and tell their own stories, either to themselves or perhaps to other board members. Imagine Eliana being able to say to other people on their board, hey, I'm kicking your butt this month, right? Maybe you need to get on that pipeline building in the same way that I am, because look at the results, right? These kinds of data-based tools help people feel compelled and motivated to continue their efforts. Studies have really shown how simple we are as humans in some ways: we really like to see evidence of our effort, right? So if we can use data to show people the story of their effort over time, they're much more likely to continue along that road. And this is just another example of using data organized in a visualization tool, which is just a Google Sheet, by the way, that allows the end user to tell their own story, to use the data to tell whatever story they want. This is a program where the organization is dealing with a lot of post-secondary students who are coming in to work with them on future skills development. The client is able to, let's say, just look at their last three cohorts and see program status and completion. We can see we've got 60% completion in there.
We've got slightly more urban students than rural. We have 16% of those students identified as having a disability. So the organization can use that data, because we've organized it in such a way that it's self-serve, for whatever kind of story they need to tell to whatever member of their community they're engaging with. The idea really being that when we create data visualizations or collections of data, we make them filterable and manipulable for the end user, because every user will have a different way that they want to use that data to tell a story. So a really key first step when you're thinking about leveraging data for storytelling is to recognize that the stories are infinite and the audiences are infinite, and so showing my data in a way that's easy for folks to use how they need to use it is a really critical exercise in user-centric design. One additional, specific example of a way that you can use data to tell a particularly provocative story is this sheet. What it shows is event costs for an event-based program, okay? This program has decided to track expenses related to the events that come from social enterprise vendors. They use art in their events, so every time there's a purchase from an artist or an organization that's a social enterprise, we track that it was from a social enterprise vendor. Similarly with event costs, marketing and promotion, and materials and supplies like printing; I did a lot of printing for this event and always used social enterprise printers. So we were able to determine that 12% of our total event costs were spent with social enterprise businesses, okay? Now, this information was something that we collected because we were interested in it.
But what it ended up being really useful for was telling a story to the funders, many of whom were investing in social enterprises or had a social innovation focus. We were able to say to them, hey, you know what, when you give us a dollar for our program, that money's going back into the community, because we're prioritizing spending the money on social innovation. And look, we can show you our event budget data right here; we could pull it up in the meeting and show them we're really committed to this, okay? So we used this data to tell the story of our commitment, beyond just the outcomes of the program, to the whole social innovation space writ large, okay? And the data's been really provocative in that story because it doesn't really lie, especially financial data. Listen, there are all sorts of examples of people who fudge all sorts of financial data, but that's not what we're talking about here. For the most part, the dollars and cents don't lie. And this effort to track and represent the data in this way indicated to the funders the intention and the values of the organization. So it conveyed a value proposition. It wasn't really about the facts of the figures; they didn't really care that it was 12%. That number could have been 22%, that number could have been 8%. But what it demonstrated was that we have intention around this, because we put effort behind it, okay? And that is what creates trust, and that's what creates the conditions for additional support. So, you know, all that to say that in some ways data-driven storytelling really does inspire trust, okay? For better or for worse, quite frankly, because we know that data can be manipulated, sometimes to nefarious ends. So for better or for worse, data-driven storytelling does tend to inspire trust, okay?
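As a sketch of how little structure that tracking takes: here's a minimal, hypothetical Python example (the line items, amounts, and field names are all invented for illustration) showing how a simple yes/no social enterprise flag on each budget line is enough to produce the kind of spend-share figure described above.

```python
# Hypothetical sketch: each budget line carries a yes/no flag marking
# whether the vendor was a social enterprise. Items and amounts are
# invented for illustration.

def social_enterprise_share(line_items):
    """Percentage of total spend that went to social enterprise vendors."""
    total = sum(item["amount"] for item in line_items)
    se_spend = sum(item["amount"] for item in line_items
                   if item["social_enterprise"])
    return round(100 * se_spend / total, 1) if total else 0.0

budget = [
    {"item": "Printing",     "amount": 400,  "social_enterprise": True},
    {"item": "Artist fees",  "amount": 500,  "social_enterprise": True},
    {"item": "Catering",     "amount": 1200, "social_enterprise": False},
    {"item": "Venue rental", "amount": 5400, "social_enterprise": False},
]

print(social_enterprise_share(budget))  # 12.0
```

The point isn't the arithmetic; it's that one extra column of structure, the flag, is what makes the "we spend X% with social enterprises" story computable at all.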
But in order to get to that place of trust in using data to tell our stories, we have to know something about the audience for whom the data story is intended, okay? Before we can know how to inspire trust in them. In the example of that funder, again, we know that they are invested in the social innovation space. We know that they support some social enterprises, so we know that they value social enterprises. So us showing them our commitment is a tactical way to use the data to tell that kind of story and create that kind of connection. But we had to know those things about them as our audience before we could make the leap into what kind of data-driven story to tell, okay? Listen, all stories have audiences. In fact, you could even say that a story doesn't really exist without an audience. Again, recalling my literature degree days, I had one fantastic professor named Ted Chamberlin, who wrote a book called If This Is Your Land, Where Are Your Stories?, which is a fantastic book about storytelling and land and place. One of the things Ted was an expert in was the oral storytelling tradition, the pre-written tradition of telling stories. And what he described was that we used to tell stories with the intention of them being remembered, right? Because if there's no mechanism to write these stories down, then you have to ensure that you're telling a story in a way that's gonna be memorable, it's gonna be repeatable, and that the person who's listening to the story, even though they might put their own little spin on it, is gonna remember the core elements, okay? So we actually have this built-in wiring in our brains that is connected to story, whether that's telling it or listening to it. And our tradition of telling stories as humans is much older than our written tradition, right? So it's really deeply ingrained in there.
But again, in order to tell that story effectively, and to tell it effectively enough to be remembered, you have to think about your audience. Who is the story for? Okay, so whether the story is fiction or it's text or it's data, you have to think about your audience. What do they care about? What information do they need to know? Who are they? What are their priorities? Okay, these are really the first things you need to think about when you're starting to think about using data to tell stories. A really good method, and those of you who are marketers on the session, this will not be unfamiliar to you, you will probably know this paradigm, is the Know-Feel-Do method, okay? Again, your data story isn't about the message that you send, but the one that's received, okay? So how are you ensuring that the message you send is received in the way that you want it to be? That's an exercise in getting into the mind and into the values and interests of that audience, in identifying them. Just again using this example, I did the Know-Feel-Do. Okay, so who is this audience? They're a private foundation with an interest in social innovation. The Know section is articulating the point of the story. And the point of our story here is that our program is committed to supporting social enterprises. The Feel section is an effort to show the audience that you care about their priorities, okay? We did that by saying: we track every dollar invested in social enterprises; your support for our program is also support for social innovation. Okay, so that's showing them that we care, that their priorities of social innovation and social enterprises are our priorities too. And then the Do section is: how do we make it clear what that audience should do next? The next step here is: look at the data, review it right here and see where we spend our money, okay? And as a result, feel confident in supporting our organization going forward.
So now that you know your audience, who you're going to be telling your data-driven story to, now we need to get to know the data. And this is always a little bit trickier than getting to know the audience, I think, because it requires, to some extent, skills that you might need to bring in: some analytical thinking, some process-based thinking, some folks who are really good at categorizing things. If there's someone on your team who always orders stuff on their desk by small, medium, large, or who organizes their books in alphabetical order, or by genre, or by color, that person who's always doing the organizing, they're probably a person you really want to bring into this exercise of getting to know your data, because they probably have a bit of a process-based or organizing tendency. Some of you might be familiar with the Data-Information-Knowledge-Wisdom pyramid, okay? This is really important to consider as you think about getting to know your data. At the bottom of our pyramid is the data section, right? This is received from the world, more or less, as a reflection of how the world works. That's a drastic oversimplification, but essentially it is, to some extent, a representation of some kind of truth. So for example, if I were to pull out a thermometer and measure the temperature in the room, I would have a measurement of the temperature, which would be my data point. The temperature exists regardless of whether I'm measuring it or not; my act of measuring it is what turns it into data, okay? So data comes at us from all different places, if we choose to collect it. But in order for it to become information, we have to apply some structure to it. If I just put the number 18.5 on a piece of paper, it would be just the number 18.5. But if I put a little C next to it, then oh, that's degrees Celsius, okay?
So I've applied some structure to it and now it's become information. Now you know that number is referring to a temperature, okay? The next step is another human intervention, to turn that information into knowledge. The human intervention is: okay, 18.5 degrees, I think I only need a light jacket, okay? So with that information, now I know what kind of jacket I need to wear. And then wisdom is yet another human intervention that applies another layer of knowing to knowledge, okay? Which would be the wisdom to know that, generally, this time of year, that's the kind of temperature to expect, right? So that's the level of wisdom that we apply in this example. You can't get to any kind of wisdom or knowledge, which is what we typically try to convey in storytelling, without first knowing what your data is and applying structure to it so that it becomes information. Again, if we go back and think about our social enterprise spending budget, we had to apply structure to our budget, which is to say, just a column that asks: is this a social enterprise vendor or not? And if the answer was yes, then that was enough structure to turn that budget data into information about our social enterprise spend. So then the question becomes, what logic should you be applying to your unstructured data? Again, the data has to have that structure to be used in storytelling; otherwise it's just floating around the bottom of the pyramid with no way to move up, okay? These are just some examples of structured versus unstructured data. Structured data, some examples: a date, a phone number, a social security or social insurance number, credit card numbers, names, addresses, transaction information. Usually this information is already structured because it has gone through a predetermined process by which structure is applied to it, so that it conveys meaning or information.
Unstructured data, by contrast, typically looks like text files, reports, email messages, audio files, video files, images, surveillance imagery. I'm not sure why they randomly chose surveillance imagery to include in this table. But generally, in a report, for example, there might be structured data, but the report itself is unstructured, okay? You can start to apply structure to any one of these unstructured data sources, and that's a process of human intervention. So for example, with your images, you might start to apply tags to them. Now the images become a bit more structured, because you can find them, you can group them, and then you can start to use those groups or categories or tags to tell different kinds of stories about those images, okay? But that effort of, for example, just tagging a series of photos means that you have to clarify what it is that's important to you about those photos, okay? What would be the set of tags that I would wanna use on those images? And that sounds like a very small and pedantic exercise, but it's actually deeply related to your mission-related goals. The effort to apply structure to your data should flow from a well-established and defined set of mission-related goals, okay? So if you haven't done so, if you've never really gone through the exercise of defining what your organization's mission is, what you're working towards, and what the outputs are that indicate whether or not you're meeting those goals, then this is an exercise that you really wanna do before you can start to dig into what kinds of data you can use to tell what kinds of stories.
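To make the photo-tagging idea concrete, here's a minimal, hypothetical Python sketch. The filenames and tags are invented; in practice the tag vocabulary would come from the mission-related goals just described.

```python
# Hypothetical sketch: tags turn a pile of image files into something
# findable and groupable. Filenames and tags are invented.

photos = {
    "IMG_2021-06-14_1030.jpg": {"youth workshop", "outdoor"},
    "IMG_2021-06-14_1415.jpg": {"youth workshop", "guest speaker"},
    "IMG_2021-09-02_0900.jpg": {"fundraiser", "outdoor"},
}

def find_by_tag(photo_tags, tag):
    """Filenames carrying the given tag, in sorted order."""
    return sorted(name for name, tags in photo_tags.items() if tag in tags)

print(find_by_tag(photos, "outdoor"))
# ['IMG_2021-06-14_1030.jpg', 'IMG_2021-09-02_0900.jpg']
```

Once every photo carries tags, the unstructured pile becomes groupable and countable, which is exactly the move from data to information on the pyramid.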
This is commonly referred to as a theory of change exercise. For those of you wanting to get really good at using data to tell the story of your organization's effort or impact who haven't done a theory of change, or perhaps haven't looked at your theory of change in a long time, this is an essential first step, right? It's the story flow, it's the hymn sheet, right? It indicates that we're all singing the same song, okay? Your theory of change, really briefly, is the big statement about why your organization exists, and it also includes the outcomes of your work. Okay, so here we've got a program that is working with street-involved youth, and their outcomes are that young men are self-reliant and employable, that the boys live in a stable environment, and that fewer youth end up on the street. Then they know: these are the activities that we do that contribute to these outcomes, okay? And all of that theory of change, that flow from activities through outcomes into impact, can be understood as effective when we start to look at the data that we gather as a matter of conducting those activities, okay? So this theory of change, this really nicely designed map and flow of activities through outcomes and into impact, always then leads to a measurement and evaluation structure, okay? Which is what we have here. So here we have some outcomes, and we have an impact area. For example, if an impact area is sustainable income, so we want our program participants to have sustainable income, then the indicator of whether or not they have sustainable income is whether they have a job, right? And the way in which we understand whether they have jobs is that we have a source for that data: we have home visits, we have a home visit form and an aftercare notification, okay?
So those are the data sources that contribute to understanding whether or not our program participants are getting job placements, and as such having sustainable income, and as such not becoming street-involved. The effort to understand our data has to be done in the context of our larger organizational strategy and mission. Telling good stories through data means that we need to know what story we're telling, okay? And everybody needs to be on the same page, and the theory of change is really that common story flow. Okay, so again, if you're looking to get better at telling good data-driven stories, first you need to understand what the change is that you're hoping to make in the world, okay? For rapid theory of change exercises, there are all sorts of really great free workshops and content online that will help facilitate the process. So, knowing what your mission is, and the type of data that you need to collect in order to understand whether or not you're delivering on that mission, is really critical. As part of your effort to get there, you can start today by looking at what data you do have, okay? Knowing what kind of data you do have is a good first step in understanding what stories you can begin to tell. For example, did you have more than 75% participation in a program survey? This can be a really good sign of high engagement. Is the open rate on your email invites to the program really high? That could also be a good sign of engagement. Do you have participants that have attended multiple events or programs? That can also be a sign of really high satisfaction, okay? Oh, we have 30% of participants that have attended more than three sessions, okay? That would be great. So what you're doing is you're looking for the countables in this data, so that you can begin to make inferences about what those figures mean.
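A couple of those countables, survey participation rate and repeat attendance, take almost no machinery to compute. This is a hypothetical Python sketch; the participant IDs and session counts are invented for illustration.

```python
# Hypothetical sketch: two "countables" from program data. All numbers
# and participant IDs are invented.

def participation_rate(responses, participants):
    """Percentage of participants who completed the survey."""
    return round(100 * responses / participants, 1)

def repeat_attender_share(attendance, min_sessions=3):
    """Percentage of participants who attended more than min_sessions."""
    repeats = sum(1 for n in attendance.values() if n > min_sessions)
    return round(100 * repeats / len(attendance), 1)

attendance = {"P01": 5, "P02": 1, "P03": 4, "P04": 2, "P05": 1,
              "P06": 6, "P07": 2, "P08": 1, "P09": 1, "P10": 2}

print(participation_rate(responses=8, participants=10))  # 80.0
print(repeat_attender_share(attendance))                 # 30.0
```

Figures like these are the raw material for the inferences above: an 80% survey response rate clears the 75% engagement bar, and 30% of participants attending more than three sessions suggests high satisfaction.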
By the countables, I mean that kind of quantitative data: anything that a number can be applied to. So imagine that you're going to audit program data; that's a good place to start. You'd wanna gather together any survey results; maybe there's an exit survey and an entry survey. You wanna look at your email invites, and if you have the ability, look at who opened them and who clicked on them. You wanna look at event details: what were the events associated with the program? What was the location? What was the time? What was the theme of those events? Oftentimes we don't do enough to categorize the themes of our content so that we can understand which themes are resonating over other ones. You wanna gather your registration forms. You're usually counting, obviously, the number of registrations that you have, but are you asking additional questions in those registration forms that could have some good, relevant data in them? Look at your list of facilitators: what do you know about them? All the data that you can know about your facilitators: demographics, where they live, what their areas of expertise are, how often they facilitate. Your participants: how many of them are there? Who are they? Again: demographics, participation rates, lengths of engagement with your organization. Photos: categorize them by the metadata that they already contain, which is usually a date and a time. If you upload a photo from your phone into a drive, it's automatically named with some metadata, usually date and time. Okay, and then can you go through the effort, as we mentioned, of tagging? Tagging is really key. Can you look at your budgets, right? What kinds of program-related budget data do you have, right? So gathering it all together is really key. Eli, you're absolutely right.
Tip: try to know what you wanna report on before you launch a program. That's obviously essential, and it would be part of that theory of change exercise. Sometimes, though, we lose sight of program reporting and find ourselves with a whole lot of unstructured information. Or perhaps we have good information that's contributing to a funder report, and that's the only way we've thought about reporting on our program data. But in fact, what else can we surface from that program-related data? Not just checking the box that the funder needs, but actually connecting to our mission and our values and our outputs. Yeah. Okay, so once you've gathered all of that data, and now you're like, okay, I've got all this information, what can I surface in here? Now this is the creative part: determining what kinds of questions you wanna ask of that information. Did participants benefit from the program, for example? Which events were best attended? Did some facilitators resonate better than others? Have our program costs increased, decreased, or stayed the same? Why? Coming with a set of questions, you might call this the business analysis piece, to apply to that data when you put it together can be really helpful, because then it's this really cool investigative journey. You're like, did some facilitators resonate better than others? Hmm, I wonder how we could figure that out. Okay, let's cross-reference the dates when we know facilitators ran sessions with the survey data for those particular dates. Oh, now we can see: facilitator A received satisfaction scores of eight out of 10, facilitator B is at six out of 10, and facilitator C is at five out of 10. So what do we need to do to support those facilitators to deliver the content in a way that's more successful?
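That cross-referencing step is mostly a matter of joining two small tables: session dates to facilitators, and survey scores to session dates. Here's a hypothetical Python sketch; the dates, facilitator names, and scores are invented for illustration.

```python
# Hypothetical sketch: join survey scores to facilitators via the
# session date. Dates, facilitator names, and scores are invented.

from collections import defaultdict
from statistics import mean

sessions = {  # session date -> facilitator who ran it
    "2021-03-01": "A", "2021-03-08": "B", "2021-03-15": "A",
    "2021-03-22": "C", "2021-03-29": "B",
}

survey_scores = [  # (session date, satisfaction out of 10)
    ("2021-03-01", 8), ("2021-03-08", 6), ("2021-03-15", 8),
    ("2021-03-22", 5), ("2021-03-29", 6),
]

# Group each score under the facilitator who ran that session.
by_facilitator = defaultdict(list)
for date, score in survey_scores:
    by_facilitator[sessions[date]].append(score)

averages = {f: mean(scores) for f, scores in by_facilitator.items()}
print(averages)
```

Neither data source answers the question on its own; the session date is the shared key that lets the two structured lists tell a story together.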
Coming with those interesting kinds of questions helps you understand more about what you can do in the future to perhaps improve on your program, or just tell the story of its success. Context, again, is really key when you're looking to analyze that data and surface some good stories. So make sure you're considering the full picture. As we mentioned, data can be used in all sorts of convenient ways; the way to make sure that you're not oversimplifying is to consider context, right? Your registration numbers might look high, but your attendance is low. Okay, did something happen that led to people not attending? Was an email reminder sent? Was the link broken? What happened that we had such a big drop-off? Another example: your survey data may indicate that some people didn't have a good experience with the program. Are there commonalities in their context that would explain this? Did they all participate on a day when there was a technical difficulty, for example? Make sure that you're looking at the full picture around that piece of data. The next piece to take into account is to be aware of bias. Be aware of what you don't have information on, or what's perhaps missing. The data about those who didn't complete your program, for example, or weren't as successful, is as important as the inverse, those who were successful. The solution might not be in what's there, but in what's missing. And this image: some of you might be familiar with this story. During World War II, researchers at the Center for Naval Analyses were trying to protect pilots from being shot down on their runs over Germany. And they said, let's look at the planes that are coming back, let's see where they're sustaining damage, and how we can better protect our pilots by improving the design of the armor on the planes, or of the bombers themselves.
But one very smart analyst said, okay, but you're only looking at the bullet holes on the planes that came back to base, in a sense, the survivors. What about the ones that went down? In fact, what they realized is that all of the holes on the bombers that came back to base indicated the strongest places on the plane. They indicated the places where the plane was most likely to be able to take fire, take damage, and not go down. What they realized was that if they protected all of the places on the plane where there wasn't any damage, they would in fact ensure that more planes came back. And that's exactly what happened. The places that didn't have any damage, parts of the fuselage and so on, and I'm not an aerospace engineer by any means, were actually the most vulnerable spots on the plane. So that's what they ended up reinforcing. You have to be aware of what you're not seeing in the data, because what you're not seeing can also be where the really great stories are. I love that story; it's a great story about bias. And then finally, you wanna categorize your qualitative data. Okay, so like I mentioned, you might have big text files or image files or audio files, and you might not know how to analyze them, especially text-based content. For example, you might have anecdotes or text given via interviews or open text boxes in surveys. The tip would be: read all those responses first, and then start to group them into categories. You can even do this as a physical exercise, right? Print off all of those interviews or text-based anecdotes, cut them up, and start grouping them together.
And then you can apply a category or a tag to those responses, which turns a whole bunch of unstructured text into something structured, something that can be found easily and analyzed: oh, we've got 80% of our comments falling into this category, 10% into this one, and 10% into that one, okay? And that effort can help you recognize patterns, again, of success or challenge or opportunity.

Okay, that was a lot. Sorry, I feel like I was racing through that to some extent, but we've got a few minutes left. I'm interested to know: what kind of data do you have that you're hoping to tell stories about? What questions are you trying to answer with your data? Who is the audience for your data-driven stories? That's really key, remember that one. And the final one to keep in mind: what structure do you need to apply to your data? Those are just some key reflection questions for you to take away. And now I'd like to hear from folks what questions you may have.

Thank you, that was really great. And I love the really practical approach of saying sometimes we're just gonna have unstructured data, so how do we structurify it? Which I think is a really fabulous insight, to say, oh, it's not lost, there are still real opportunities around this. I'm gonna allow a moment for people to drop questions into the chat. Let us know what you're curious about, what some of your experiences have been. Feel free to share a story of pain and tears.

But let me start off with a big question, which is the longitudinal survey question. I've often struggled with some of those behavior change questions, to say: they started here, then they did cool things with my program, and then at the end they're probably doing a whole lot better. But what I often struggle with is that once people are done with the program, their incentive for giving me the time to fill out the survey is really reduced. So I get really low returns on those feedback requests a year later.
And I'm wondering if you have any tips on how I might be able to more effectively structure these longitudinal kinds of questions so I can see behavior change over time.

Yeah, that's a great question. And it's an issue I'm having with three of my clients right now, who all work on programs where they are looking to see a behavior change as a result of their efforts. And that's difficult to measure. Disclaimer: I'm no measurement and evaluation expert. Behavioral change is very difficult to measure from a survey to begin with, because people don't tend to analyze or reflect on themselves in a negative way. Nobody is going to say, oh yeah, I'm deeply prejudiced, and then at the end of the program, no, I'm not prejudiced anymore. You can really only count the effort that they've engaged in.

For those behavioral change or longitudinal efforts, I think interviews are some of the best ways to do that. And you incentivize people over the long term by giving them a place to track their progress. Whenever you can, create opportunities for people to have their own little sections of a platform where they can see a progress bar: oh, you've done this many learning pieces, you've engaged in this many feedback sessions. People will tend to want to stay connected and continue to move along with the program. So I think that can be one effective way.

And then, obviously we don't have time to go into this, and I'm sure there's content that you provide on this, but AI is starting to emerge in ways that can do some really powerful stuff around longitudinal behavior change. I know one organization I was working with around violence in the workplace, and one thing they played around with was a machine text analysis program that they basically installed over their email client. It wasn't reading the emails for content.
It wasn't interested in what was being said in the emails; what it was able to surface was sentiment. They could then send folks a little bit of a nudge saying, hey, the language you used wasn't inclusive, and you might want to think about using this instead. So a really interesting application of technology, playing the role of a little bit of a coach to help reinforce the learning.

That's really helpful. And it's always a great reminder that there's this huge gap between people's expressed behaviors and interests and what they actually do. Quick follow-up question. If we were to take this interview approach, which is obviously higher lift, we're not going to touch as many people, but as you say, it might let us get a little bit deeper, do you think I'm going to harm my results in any way if I use something like a gift card to incentivize participation?

I don't think so. Any kind of incentive that you can provide to honor and respect people's time, and we see this more and more in community-based fundraising and research, compensating your research subjects for their time is respectful and completely appropriate.

Got a comment coming in here from Michael. Do you want to come on, Mike?

Yeah, sure, thank you. Hi, I'm from Hong Kong, and responding to Eli's question about the self-administered or self-report behavioral assessment: of course, there will be a problem of return rate and overall underestimation. In my experience, we built a multi-party assessment tool for play therapy in our center. And what I mean by multi-party is that we observe the children's behavioral change, and we also ask the parents, the mother and father, what they observe at home.
And if possible, we'll also send a questionnaire to the teacher to see if the behavioral change is present in the classroom. My experience is that if the parents are concerned, especially when they are the ones referring their kids to our service, they are more motivated to complete the questionnaire.

Yeah, that's a really good insight, Michael, thank you for that. Having as many people as you can answering the same question from multiple different viewpoints is going to create a much richer dataset. Thank you for that, Michael.
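As a note for readers, the tag-then-tally step described earlier (applying a category to each open-text response, then counting how responses break down) can be sketched in a few lines of Python. The responses and category names here are invented purely for illustration:

```python
from collections import Counter

# Hypothetical tags, applied after manually reading each open-text response.
tagged_responses = [
    ("Loved the facilitator, learned a lot", "success"),
    ("The video link was broken on day two", "technical issue"),
    ("Great community, would join again", "success"),
    ("Could not hear the audio in session three", "technical issue"),
    ("Sessions ran too long for my schedule", "scheduling"),
]

# Tally how many responses fall into each category.
counts = Counter(tag for _, tag in tagged_responses)
total = len(tagged_responses)

# Report each category's share of all responses.
for tag, n in counts.most_common():
    print(f"{tag}: {n}/{total} ({n / total:.0%})")
```

Once responses carry tags, the same structure supports filtering (show me every "technical issue" comment) and comparing category shares across program cycles.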
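The email "language coach" mentioned in the discussion was a purpose-built tool, but the core idea, scanning a message and nudging the writer toward more inclusive wording, can be sketched with a simple lookup table. The flagged phrases and suggested replacements below are hypothetical; a real tool would rely on trained sentiment and tone models rather than a hand-made list:

```python
# Hypothetical phrase -> suggested replacement pairs.
NUDGES = {
    "you guys": "everyone",
    "manpower": "staffing",
    "chairman": "chairperson",
}

def nudge(message: str) -> list[str]:
    """Return a gentle suggestion for each flagged phrase found in the message."""
    text = message.lower()
    return [
        f"Consider '{replacement}' instead of '{phrase}'."
        for phrase, replacement in NUDGES.items()
        if phrase in text
    ]

suggestions = nudge("Hi you guys, do we have the manpower for this event?")
print(suggestions)
```

The design choice worth noting is that the tool coaches rather than blocks: it surfaces a suggestion alongside the writer's own words, which is what made it useful for reinforcing learning over time.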