It's common for organizations to focus on the technology as the limiting factor: "if only we had a different X, we would be doing so well." This is one of the things the data can help you prove out, because another piece of the quality and capacity of engagement is really the configuration of your systems, the people you have running them, and the capacity those people have to use those systems. Most organizations don't necessarily have the worst tool for what they're trying to do; what they have is a bad setup for using that tool. And that setup isn't just at the technology level. It's at the capacity level, the understanding level, the messaging level, the engagement level. If we start to think about these as a taxonomy for data, we can start to see where you're getting stuck, in terms of which thing needs the most help. If you can say, "we're only able to spend four hours a week in the school, and we think actually creating value out of it would take 12 hours," you're having a different conversation than if you're telling people, "the email systems aren't working for us." That's one of the values of data: it helps create illustrative points about how and why you need to change what you're doing.

So if we move on, we'll now talk a little bit about how data can help answer that question. Modeling data is really important, and this is where the mental framework comes into play and gets a little more concrete. Here are three examples of things you could measure. The first is the editorial process: how long and how much effort does it take to create content, versus the engagement that content creates? That kind of report, in and of itself, is extremely valuable for changing behavior.
Almost every organization I've ever consulted with has at least one of these content types, one of these things you make every year where you're likely to spend three weeks making it and maybe 100 people click on it. And we don't even see those as the right 100 people; those aren't our influencers or funders clicking on it. So if you can measure the effort against the result, that's really valuable. That's the report, and most people have pretty easy access to it. You can mock this up. The other thing I'd say about data is: don't be so worried about it being completely accurate. You might be thinking, "I don't want our team to fill out time cards." Not a problem. If you have a way to even t-shirt size these efforts, that's really valuable. If you can say "this is a large effort and this is a small effort," you're already starting to create data. So don't be too worried if you're not sure you can justify the 50 minutes it takes to run it. We don't need that kind of precision necessarily.

The next item is what I call real-time production monitoring, which is to say these conversations, like we mentioned earlier, need to be maintained and built and nurtured, and you need to have the right mix of content. If you're only writing content for people who've never met you, all the people who have met you aren't being served. If you're writing content only for experts, you're not on-ramping other people, and you're not helping people who know experts connect their network to your network. So that's another piece of it. Just knowing what you're making, of what type, and at what cadence, and letting people know what you can do, is really helpful. And that's especially helpful when people want you to change what you're doing.
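The effort-versus-engagement report described above can be mocked up in a few lines. This is a minimal sketch: the content titles, t-shirt sizes, click counts, and the point values assigned to each size are all hypothetical placeholders, not real measurements.

```python
# Sketch of an effort-vs-engagement report using t-shirt-sized effort
# instead of time cards. All names and numbers below are made up.

EFFORT_POINTS = {"S": 1, "M": 3, "L": 8}  # rough relative effort weights

content = [
    {"title": "Annual report", "effort": "L", "clicks": 100},
    {"title": "Weekly newsletter", "effort": "S", "clicks": 450},
    {"title": "Expert explainer", "effort": "M", "clicks": 220},
]

# Compute a crude return-on-effort score for each piece.
for item in content:
    item["clicks_per_effort"] = item["clicks"] / EFFORT_POINTS[item["effort"]]

# Rank by return on effort, highest first, to surface the
# three-weeks-of-work-for-100-clicks pieces at the bottom.
for item in sorted(content, key=lambda i: i["clicks_per_effort"], reverse=True):
    print(f'{item["title"]:>18}: {item["clicks_per_effort"]:.1f} clicks per effort point')
```

Even with coarse S/M/L buckets, a ranking like this is enough to start the "is this annual thing worth three weeks?" conversation.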
Say, at a pandemic moment, if you have a sense of what your current capacity and capabilities are, then when people want you to shift things, you can have more intelligent conversations about what the trade-offs are going to be: "if we do X, we're going to have to remove Y." Because there's often an expectation that we can just throw some more stuff on the fire and it'll be great.

And then this last thing, what I call the content performance life cycle. It's another jargony term, but it really means that some of your content is very ephemeral: it has an impact for a day or two days; it has a very short shelf life. Other content is very useful and evergreen. It might be something like the five-year plan, or a seminal research paper. If you have a content performance life cycle, you can explain to people why it's useful to spend effort on old things, things you've already created, versus new things. That's another challenge for communications: everything you communicate tomorrow isn't necessarily better than what you communicated yesterday. Being able to tie together the quality work you've done and the new work you're doing is really valuable, but it's really hard to justify going back and refreshing and focusing on existing content and existing engagement. So if you can help visualize for folks what the performance life cycle, on average, is for the kind of content you're creating, and why you need to promote and use the things you've already made, that can help change your operations as well.

So these are just three little vignette examples of how you can use data, and think about data, to have better conversations.
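To make the life-cycle vignette concrete: one toy way to visualize shelf life is an exponential decay model with a different half-life per content type. The content types and half-life values here are illustrative assumptions, not measured figures.

```python
# Toy content performance life cycle: model how daily engagement decays
# after publication. Half-lives below are illustrative assumptions.

HALF_LIFE_DAYS = {"news post": 2, "research paper": 180}

def engagement(day_zero_views: float, content_type: str, day: int) -> float:
    """Expected views on a given day after publication, with exponential decay."""
    half_life = HALF_LIFE_DAYS[content_type]
    return day_zero_views * 0.5 ** (day / half_life)

# Thirty days out, a news post is essentially spent, while a research
# paper still draws most of its launch-day attention -- a simple case
# for re-promoting evergreen pieces instead of only making new ones.
for kind in HALF_LIFE_DAYS:
    print(f"{kind}: {engagement(1000, kind, day=30):.1f} expected views on day 30")
```

Plotting these curves side by side is one way to show why refreshing existing evergreen content can out-earn a brand-new ephemeral piece.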