Today's session is on leveraging data to explore the content lifecycle. We are going to be talking with Jamie Perez. Jamie is just shy of a 20-year veteran in the mission-driven sector, really a pioneer in a lot of good practice in digital outreach, and currently at the Center for American Progress doing some incredible work for the progressive movement. So thank you very much, Jamie, for joining us, and looking forward to the discussion. As we go, we're going to talk a little bit about a new program that Parsons TKO is launching and we're very excited to share with our community, the Data Innovation Studio. So we're going to give you a little detail about that, and Jamie, as one of the members of this new program, is going to showcase a little bit of what we're striving for and the type of experience we're trying to create for our membership. So excited to talk through all of that.

First things first, Parsons TKO. I know we have a couple of new names and new faces joining us today. Parsons TKO is a consultancy. We focus on the mission-driven sector and provide services to communications teams, development teams, and executive offices, really helping those teams work together to successfully engage their audiences through a mix of consulting and road-mapping work, focusing on designing technical platforms and thinking about the data strategy that leverages all of those pieces. The core philosophy of Parsons TKO, the methodology that we've organized our company around, is called engagement architecture. It helps us think about organizations' strategies; the people, internal and external; the processes that they use in order to drive audience engagement; and last but certainly not least the platforms, the actual technology that organizations use to create experiences and engagement with their audiences.
I am the Chief Analytics Officer for Parsons TKO, and so my focus in particular is that last column: the data that flows through all of that work and through that architecture, and how we can use that data and feed it back into our work to improve outcomes. So that's a little bit about Parsons TKO. I'm going to hand it off to my colleague, Mickey, in just a few moments to talk about the Data Innovation Studio. But as a way of segue, I just want to talk about the evolution of Parsons TKO as a company. We do this consulting work. We have worked in the mission-driven space for a long time. A lot of our work is focused on projects, a single point in time. How do we move the ball and create a new capability? And in a lot of that work, there's room for creativity. But there's also a lot of space in between. There are the relationships that we maintain with our clients, the things that we explore, all the projects that we never got to do for whatever reasons. And so a big point of this studio was to create a space where we can have those longer-term relationships, and people can really feel like they are part of a community that is being creative and coming up with new use cases for data, coming up with new ways that their organizations can create new audience experiences using that data. So the purpose of this studio is to be that space, a place for creativity and collaboration. And so with that, I will hand it off to Mickey.

Thank you so much, Devin. So hello, I'm Mickey. I work at Parsons TKO on the Data Innovation Studio, and we're so excited to tell you all about the Data Innovation Studio, which is PTKO's newest research arm. So to dive in a little bit more about what the studio actually is: our goal is to perform independent research to provide the nonprofit sector with resources to leverage data.
And the way that we do this is through an ongoing cycle of studies that cover a broad range of topics and are tailored to the interests of our members. So we'll move through these studies with collaboration and discussion from our member community, and the final output of the studies will vary across reports, frameworks, and other outputs that our members can then use to support and guide their day-to-day work in super tangible ways in their organization. So here's a little glimpse into the studies that we have planned. These studies cover a robust variety of topics that range from brand analytics to using data to support DEI efforts, as you can see. All these study topics are designed to ask difficult questions of the mission-driven sector, and also to encourage our members to think critically about the state of data in both the sector and their respective organizations. At the same time, this timeline is super flexible and subject to change. We expect our members to bring new ideas, and future studies will be selected based on that input and feedback. Similarly, the length of the studies will also be affected by this and can range anywhere from one month to three months based on member solicitation and feedback. Each study will also include concrete milestones to track progress and keep members updated on exclusive reports and other studio outputs, such as templates and frameworks, that we'll be producing. And we'll actually be showing you a report from our first study in just a little bit to give you some more information about the study that's currently underway. Throughout the study, we're examining the role of data in defining and measuring content impact. We're analyzing the steps of the content life cycle and their interdependence to put into a report that we'll be able to share with our members.
So to go into more detail about how the studio membership works and what it looks like to be a member, I'll talk a little bit about how our members will be interacting with the studio. Firstly, our members participate in dialogues with like-minded thought leaders and peers, with guidance from Parsons TKO. Our members will receive access to exclusive top-level summaries from the diverse study topics that we discussed on the previous slide, as well as access to additional resources. These resources can include methodology and training sessions that will help members and their organizations turn ideas into action steps and communicate to their teams what they learn from the studio, as well as receive personalized insights and learn about data exploration tools. Members contribute to a growing community of mission-driven leaders through moderated events and also more casual conversations, all around improving engagement strategy. As our members talk with each other and learn from one another, interesting ideas and questions will pop up that we'll have the opportunity to dive into as a community and explore further through our studies. So being a member is truly about engaging in conversations and learning from peers to bring in new ideas and resources, all to impact the mission-driven sector. The first report that the Data Innovation Studio will be publishing is about the website engagement portion of content analytics. And to give you a sneak peek into these types of conversations in action, I'll pass it back to Stephan, who will lead us in conversation with one of our studio members, Jamie Perez, to talk more in depth about the content study and discuss content impact.

Thank you very much, Mickey. Really appreciate it. So yes, Jamie Perez. I think you are known to many, but there's probably a couple of people who could use a little bit more background.
So first things first, I'm going to ask you a little bit about your work, where you are in your career, some of the things that you think about. But this is a conversation that you and I have had many times before. What is content, especially in the mission-driven space? We put so much effort into it. So much of our good, so much of the mission, depends on content being able to change people's minds, change people's perspectives, their behaviors. It's such an important tool in the nonprofit toolkit. And so I think we're excited to get into some of that. Some of the questions I'm going to explore with you: what does content mean to you and your organization? Thinking about the decisions that you make, tactically, how do you work on this content? But more importantly, given a lot of the gaps in the sector, what are the outcomes of all that effort? How do we determine the impact of that time that we spend? So I'd love to hear about that. But first things first, tell us a little bit about yourself. Where are you and what are you up to?

Great, great. I am in the office, which I've only done one other time, no, two other times since last March. So apologies for the March 2020 update on my board behind me. Unfortunately, a lot of it's still relevant. Let's see. So I'm over at the Center for American Progress. I've been here a couple of years now. As you noted, I spent a lot of time in the social good space ahead of this, starting at digital agencies back as they were getting started, way back when. And actually, Robert Wood Johnson was one of my first clients, back when we were still in the other century, before Y2K, when I was at a digital agency, which was a lot of fun. So social good work has always been a part of my portfolio, maybe because I live in Washington, DC, or did at the time.
But over the years, it became an increasing part, and I spent the majority of my agency career at a small shop, still around, called 3Spot, where we actually took a kind of full-service organization and focused it just on social good work, which at the time, 2006 or '07, wasn't normal. It's a lot more normal now. It's good to see the world go in that direction. But after all those many years doing agency work, I was always the guy working with a client on strategy. I did early work with folks like the Brookings Institution, and that was right around the time CAP was being born. And we all had eyes on CAP because they were doing some really interesting things in the way they blended investments in communications with investments in policy development. It was just a very different way of looking at how change happens. And I think if you had talked to any communications person at any think tank at that time, they would have said, that is how communications happens, and I wish our investments were aligned like that. So I'm sure I wasn't the only one holding up CAP as a case study for internal conversations. But as time proceeded, for me, I saw going in-house as the opportunity to stick with the strategy for the long haul. And I think that's particularly relevant to this discussion, but probably to so many, because you do so much work helping people understand what should be done. And then you hand it off, and you hope you're making something that they can live with. But you get loyal to those strategies, and you want to be part of the living with. And so about four years ago, I shifted over to going in-house, which resulted in me coming here about two years ago. I'm a big think tank guy. I started at a nonprofit before that, in nuclear arms control, which, with my degree in English literature and experimental poetry, was an obvious fit. But there was a recession when I got out of school too. And it just kind of led me here.
And it opened me up to a whole world of think tanks, and I've made them a huge part of my life, to the point that when CAP posted the job, three different people sent it to me and were like, oh, you've got to apply for this job. And they were correct. So I love being here. I love working with the talent we have here. And that's what I'm focused on: how do we get that talent out there in the world in the best ways possible so that our great ideas really happen?

No, I think that's kind of where this takes us into the discussion. Very good. And actually, I'm going to pull down the slide there so we can speak face to face. There's a couple of things you tapped into there that have really resonated. One in particular we see so much in the nonprofit sector is people apologizing for their English majors, people apologizing for their background as a journalist. There are so many odd backgrounds in the sector. And I think that's a real truth for us: nobody was trained. It's rare that anyone was trained to do the work that they do. And yet we see things like English and creative writing as key skills because of the importance of communications, of messaging. A lot of the work that we do is changing people's minds, getting them to do things. So I'd love first just to explore that concept. What is content? You sit atop a vast digital machine. But that machine is working a lot of the same channels toward the same audience engagement goals that a one-person comms team might have to. So what's in that portfolio? What does it look like to work on content?

Oh, wow. OK, so here we have a lot of interesting structures. And we're actually doing a lot of work on some of those structures right now. So we'll talk about where the structures are today, or yesterday, if you will.
But that's part of it, too, how important it is to always be working on those structures, because new things come along, new opportunities come along, things like the coronavirus come along. It's amazing. Sorry, I've been working on taxonomy a lot. And it's amazing, when I had to give examples of, well, taxonomy is a living thing, because big things will happen that'll change it all, it was shocking how many examples I had just from the past five years. Anyway, that's the way it goes. The machine here, though: the first thing that's important to note is that we have both a C3 and a C4. I spend the majority of my time working on CAP, the Center for American Progress, our C3 work, but I do spend some time doing CAP Action work, our C4 arm. So we have both those things going. And with both those things, we have a variety of channels: name your favorite social media platforms, we have those in C3 and C4 flavors, plus websites, email, products of all kinds. So there's all that. We put out a lot of video. That's a mix. Everybody should struggle with how much video you should do. That should be an ongoing struggle in your head at all times if you do content. Like, does it do anything? Does it not? Always depends on audience, of course. But back in the day working with Brookings, some of the audience strategy work I did back then in audience research I use to this day, which is: when everybody goes back to congressional offices, just remember, every congressional office has two to three TVs on at all times. That's what your video is competing with. Competing with, you know, like a billion dollars of news production, and that was that far back. And we also have a production staff that includes an editorial staff, an editorial staff that gives ongoing writing instruction, like writing workshops. It's a hidden secret, an open secret, I don't know.
We have a Pulitzer winner on our editorial staff, and he never mentions it. Just note, it gets mentioned all the time; he never mentions it. People around him always mention it, which is pretty great. Actually, Carl's work is incredible. And the work he won his Pulitzer for is sadly still very pertinent, as it was about race relations in America, which is what it is. Anyway, so we have all that kind of content. We also have events, live events, including onsite, that include just folks within our fold or folks across the progressive movement or elected officials. We do private events as well. You know, I think it's important to have kind of closed spaces to talk about ideas when you do the kind of work we do, from time to time, because it's sometimes unpopular. I come from nuclear disarmament originally, right? Sometimes the unpopular ideas need to be discussed from a policy standpoint. That doesn't mean anybody wants to put them in place. So that's the content package. A lot of the content development happens within our policy teams. We have 20 policy teams. And so a lot of the content production is being done there, and a lot of the work at the Center is about helping them, getting them the tools they need and helping them use them the best they can.

You know, there's something you alluded to before, and it was about the role of intuition that goes into the work that we're doing. Some of the questions that we have to ask ourselves, like, is this the right thing? I kind of guess so. We've always done it. There's a good deal of institutional momentum in a lot of content creation behaviors and practices. As we think about those decisions, a lot of those decisions are the target for this data. We are looking at the data to understand what we do, why we do it, and to try to change the outcomes, which means we need to change the inputs. But with that in mind, we need to know what to judge these behaviors by.
And I think that's what it comes down to: impact. So, you know, I'll put several things in your mind, and you can unpack them as you will. You also mentioned the audience sensitivity to our outreach. How do you change what you do based on who the audience is? And when you think about the relationship between a piece of content and an audience, you're giving that content to a person because... why? What is the goal? What is the audience engagement goal? And based on those, I'd love to hear, choose your favorite, but what are some examples of how you think about the purpose of your content, and then what you wish you knew about that relationship, that content-audience relationship? What do you wish you knew that might change your decisions?

Yeah, there's so much I wish I knew. And you and I talked about this earlier, when we first started talking about the content studio. You were very focused on the production life cycle, and you still are. It's very much in your words right now, but they're not my words. And, you know, early on we talked about, like, I don't know if this is right for me, because I don't think that's the right question. And so we dug into it, and dug into the fact that we don't know if it's the right question, because how are you optimizing a life cycle when you don't have the measures yet? We have so many vanity metrics in the work we do when it comes to content. In a former life, when I was doing more commercial work, or doing a smattering of commercial work, I used to do a lot of brand awareness work, right? Like, we're rebranding a toy company that has 98% of the playground set market, and we've got to see if that name gets picked up, right? That's a real situation, right? So I would do a lot of that. And then one day we were working for Visa, like the mothership Visa. And they were like, yeah, brand awareness isn't our goal. 98% of America knows who Visa is.
This message is our goal. We want to know message retention. It's a very different metric. And at each stage from brand awareness to message retention, or message understanding even, you start to exponentiate the cost of the analytics. And, you know, when you look at things like e-commerce, you have abandoned cart metrics, and a whole industry has been birthed around that. When you look at targeting your ads, look-alike marketing on Facebook, that's incredible. And it used to be only available to people with tons of money and research at the right ad companies and ad targeting companies. Now it's free from them, as long as you stay on their platform. I know free is not free. We don't have that for content. What we have for content is a World Bank study that's, like, a hundred years old now, saying that everybody who downloads PDFs never opens them, right? That's the research we have on content right now. And things like that. And then look at the diversity of work we do, you know, when you're trying to create policy, when you're trying to create change, not even policy change. Policy change is one kind of change we're trying to create on a regular basis. But there's a whole life cycle to that, right? Like, an issue I really care about, and I'm not an Andrew Yang supporter, just for the record, and as an organization we don't endorse candidates, so even when I'm in my CAP fold I won't say, you know, we don't endorse people, we endorse ideas and we push for change. But UBI is something that matters to me, universal basic income, because I'm a big, like, far-future sci-fi dork. But UBI is nowhere. I've heard like two or three stories on it my whole life. So you have this first stage where you have to make people know the issue even exists. One of my colleagues, Banga, he's in the government now. We can talk about that kind of impact later.
Banga's work, while he was with us: he's an economist, and he would always take the jobs data and disaggregate it to populations to show that the jobs numbers getting better still means that black men are this far behind the average, and black women are even further. So he would always disaggregate the metrics to help people see that, no, there's actually a problem. Yeah, unemployment in general is getting better, but look, there are still problems here to work on. So that's kind of identifying the problem and getting it done. But then there's a whole life cycle between that and something actually being done, as far as framing debates, or, if it's gonna be policy, writing policy, doing legal defense. And then even once it's passed, let's say it's the ACA, then you've got to implement it, right? And so you look at that whole spectrum, there are so many different people to work with, so many different people to get ideas to. And each one of those stages needs different tools for different audiences, and therefore different measures. At some of those stages, we need to worry about five or six people. We don't need to do message retention. It's like, we talked to them and they literally said, this is a great idea, we're gonna go with it. That's the metric. Check, done. But when we're trying to put a problem on the agenda, when we look at Banga's work on getting people to understand disaggregated data about employment, or how measuring GDP is a joke, those take very different metrics. Like, how are we, at our scale, with our costs and our budgets, going to measure that we're occupying brain space for an audience, and do that across 20 teams, and accommodate the fact that the news cycle bounces up and down? I mean, it's not as up and down as it was during the Trump era, but it's still up and down. So we just can't afford that.
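The disaggregation Jamie describes can be sketched in a few lines of Python. The groups and rates below are invented placeholders, not real Bureau of Labor Statistics figures; the sketch only shows the shape of the analysis, comparing each group's rate against the headline number.

```python
import pandas as pd

# Hypothetical monthly unemployment rates (percent); a real analysis
# would pull these from published BLS tables.
jobs = pd.DataFrame({
    "group": ["all", "white men", "white women", "black men", "black women"],
    "unemployment_rate": [5.9, 5.1, 5.0, 9.5, 8.6],
})

# The headline number everyone quotes.
overall = jobs.loc[jobs["group"] == "all", "unemployment_rate"].iloc[0]

# Gap versus the headline: a positive value means the group lags the average,
# even while "unemployment in general is getting better."
jobs["gap_vs_overall"] = (jobs["unemployment_rate"] - overall).round(1)
print(jobs[jobs["group"] != "all"])
```

With these placeholder numbers, the table makes the point immediately: the headline rate hides a gap of several points for some groups.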
And where the studio discussions got interesting for me is the idea of peers that care about this kind of stuff coming together, saying what they do have, and trying to get to real measures, so we can understand the value of a report versus a fact sheet versus an op-ed versus a radio appearance. Those are all in the toolkit. Like, how do you pick which one? (I almost cussed there, and I'm gonna try not to swear during this, and I just caught myself, which is not something I do often.) I always say to people that get into this kind of work, into digital content work or digital marketing, anybody on this team here, that there's never an end to what you can do. There's always more you can do. There's always more promotion. There's always more distribution. There's always more bites at the apple. But how do you decide when you're done? Like, how do you measure a law of diminishing returns if you don't have measures? And page views are great, and I put them on reports all the time, you know, but it's not the same as brain space. We had a piece just the other day. Any think tank has awesome long-tail content stories in their pocket, so I have a hundred long-tail analogs of this. But we just put out, you know, we've been working on this COVID thing a little unlike anybody, right? We just put out a big policy recommendation that we believe the government should make vaccination a mandate, like make it a requirement for Medicare and Medicaid funding, you know, and I'm misquoting that piece horribly, so go take a look at it. I'm not a healthcare policy expert; I just work with those people. But it was tying needs to needs, public health needs to healthcare needs. And it was a great piece. And on its first day, a Saturday, normally our lowest or second-lowest traffic day in a week, it was our highest traffic day of the week. And that was because of that piece, which had been published at 2 p.m. on a Friday, which, don't talk to me about the wisdom of that.
But it did, boom, right? But that day one, 75% of the traffic was from right-wing sites sending people at us to make them angry. And so tell me about page views and brain space and what we're doing. That's a complicated thing, you know? So yeah.

And I mean, there's a whole dimension of this. Again, it depends which audience. So having that dimension, and also the point of the stages in the audience journey. Where are they? Is it about awareness? Is it about interest? Is it about all the other acronyms?

So we'll do just the opposite real quick, real quick. On the activist side, in our CAP Action work, we do polling to figure out what messages resonate, what outcomes, things like that. Someone's knocking on my door because we're moving out of this office. I'm in a webinar; hopefully they won't open the door. We do polling to figure out what messages we want, and then we wanna disseminate those. And that's an easier measurement, because we've done the polling research, we know this message matters, and then we disseminate it. Very different than doing content work, right? Which is much broader. With that kind of work, we're able to look and see, you know what? Video isn't working on Facebook like it used to. Period, done. Easy to know. Okay, we're gonna switch our resources to doing fewer videos and more GIF animations of charts and flat charts and things like that and compare the metrics, and we're gonna look at different sharers, like, based on the sharer, what kind of pickup do we get? So those kinds of measurements are much easier, but they required having that polling, and they required having a much more defined scope than some of the other work we do.

And a lot of those, I mean, yeah, we encounter polling a lot.
Another issue, besides the cost, is the speed of polling. Polling is really great at telling you what the truth was a month ago or more, you know, depending on the timing and the turnaround, and then the insights are however deep in the PDF that they send you. Yeah, so there's a lot of work to unpack from that. You know, I think you, yeah, sorry, go ahead.

Just before you jump off that, I think that a content studio discussion will be, hey, how are you using paid media and paid media results to do something like rapid polling? And passive data sets as well, like search data analysis, like social listening.

Yeah. You know, a lot of people are putting their opinions out there in the world even before you ask them; it's just a question of knowing where to look. I really liked your point about page views. Page views, it's the first metric that you see when you open anything that has numbers in it, and, you know, we will use it, we will report it, but we don't feel comfortable making decisions on it. You're not going to cancel op-eds as an organization because of page views. It just doesn't tell the story of the purpose of that document. It could be one page view that matters, if it's, you know, from the right congressional office. So it's just not the metric, and certainly not by itself. So let's talk about what it takes to get the right metrics. And I'll ask you this question, but I'm also going to ask the whole group this question; I'd love to see people in the chat talk about their own experiences here. What is keeping us from having the right metrics? How much of it is the technology parts of it? You know, putting the right measurement tools in place, knowing the question we're going to ask before we ask it, and, you know, engineering our data collection to capture that data.
And how much of it is, I just don't know which ones to focus on? You know, the strategic underpinning of these metrics, what do they represent? I can't decide which ones to focus on. Where is the biggest challenge there, do you think?

Yeah, I think the biggest challenge is you can't turn off the analytics while you figure out the right analytics, right? So you've got to keep on reporting those page views. You've got to keep on reporting that busted time-on-site metric, which is a whole lot better-sounding metric than page views in the first place. And then you have to train up the whole staff on what the right metrics are, how you're going to take them, how you're going to configure the reports in ways that don't just make them another version of page views. Right? Because I don't know if you've seen that, where you're like, oh, I think this is a better measure of engagement, this measure. And you put it in place, and they're thinking of gross numbers and enterprise-wide numbers. And it's like, no, that's useless to me. I'm trying to figure out which pieces are doing better than others, and why. And so you have to always make sure you're doing that conversational work of the why, and having kind of layman's statements of goals and intent, and then figure out how you're going to align the metrics to them, and fail a bunch at the experimentation, right? You might think social media sharing is a huge indicator of engagement, right? Sure. And then you read the study about how most people share things they never read, and then you look at your web metrics and you see nobody ever shares with your buttons, so you can't measure it that way. And you can go on down the line with problems, the same way you have all your problems with, say, time on site, right? So we're trying to get there. We're trying to see our way through those metrics.
Another one: you might want engagement to be, like, page views per visit, right? Oh, that sounds amazing. But I tell you what, what I want every user to do is, I want every user to think, hey, what are the gun violence deaths by state in 2019? Google it, get it from me, download it, and go to work. I don't want them on my website. I'm not a media site. I want them out there in the policy landscape getting shit done. And I can look at my pages-per-visit metric. It's been between 1.2 and 1.3 since 2008 and it doesn't change, you know? Like, a huge change is from, like, 2.3 to 2.4, you know? I'm supposed to care about that? I can't. And instead I have a focus on repeat viewers, right? But my users have how many profiles, how many devices, and how often do they change jobs? All those numbers, the answer is a lot. So, you know, I have a mess there too. So I'm doing a real, like, kind of fuzzy-logic CRM thing. Okay, do that at scale as a nonprofit, and go convince some donor somewhere, or some foundation, that we really need to invest in making the nonprofit version of e-commerce-grade analytics. Maybe that is what I'm gonna do over the next five years, or try to, but that's what we need.

Yeah. And I think that's exactly it; that's a great point. And one of the big motivations for the studio is the fact that we can't just wait for these better metrics to come to us, because there is no industry above us that's motivated to do this. E-commerce? Plenty. There's plenty out there. There are lots of people cracking that code because there's lots of money in it. But when it comes to thought leadership, when it comes to research, we can turn to academia, but they don't have the need, they don't have the drive. They are not as mission-driven as a lot of these nonprofits that are on the cutting edge of mind share.
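Jamie's point, that a one-page visit ending in a download can be a success rather than a failure, can be made concrete with a small sketch. The session logs and event names below are invented for illustration; a real pipeline would read them from an analytics export.

```python
from collections import defaultdict

# Invented (session_id, event) pairs standing in for an analytics export.
events = [
    ("s1", "pageview"), ("s1", "download"),  # one page, but mission accomplished
    ("s2", "pageview"), ("s2", "pageview"),  # browsed two pages
    ("s3", "pageview"),                      # one page, left without downloading
]

pages = defaultdict(int)
downloads = defaultdict(int)
for session, event in events:
    if event == "pageview":
        pages[session] += 1
    elif event == "download":
        downloads[session] += 1

# The conventional metric: average pages per visit.
pages_per_visit = sum(pages.values()) / len(pages)

# A "good bounce": a single-page session that still got what it came for.
good_bounces = sum(1 for s in pages if pages[s] == 1 and downloads[s] > 0)
print(round(pages_per_visit, 2), good_bounces)
```

On this toy data, pages per visit is a flat 1.33, yet one of the two single-page sessions was exactly the "Google it, download it, go to work" behavior the site is for, which is the distinction the metric alone cannot see.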
You'll see some organizations out there — you can research them yourself and find them — that have really tuned-up analytics-plus-publishing packages for WordPress, because so many people are on WordPress. But when you dig in on those, they're built for media organizations. And stickiness doesn't matter to me, you know? Subscriptions don't matter to me. So even those tools, which are really amazing, whether they're all-in packages or analytics packages — they're very seductive, but when you look at the strategic intent at the core of what they're delivering to, it's importantly askew. I mean, shit — sorry, I'm gonna allow the S-bomb in this conversation. When I stopped myself, it was the F-bomb. I just want you to know there are different degrees. Our censors can't keep up. Look at HR at CAP, right? Retention, such an awesome thing. 59 CAP staffers or former CAP staffers are in the Biden administration right now. Do you think we'd bemoan a single one of them? We should have a party on the way out, but everybody's home. I can name all kinds of people I've worked with for a year and a half or two years that are working on incredible progressive ideas that I really believe in, many of which you see in action right now: the child tax credit, which, just quoting other sources, can erase half of child poverty the day it goes into implementation. Go look at what happened in the UK when they did this. I worked with someone here who was working on that for a year and a half, and then they went into the administration in a relevant position where they're able to talk to the people who are actually writing the piece of paper to make that kind of thing happen. So that's part of our impact model. Our content has the same job to do as that does.
You just see it all over think tanks. We're just such an odd thing that we have to band together to figure this stuff out, because — let me just say it out front — all this stuff should not be a competitive advantage among think tanks. This should be known shit, so donors get more out of their money, foundations get more out of their money, or whatever our funding models in the future are get more out of their money, because we shouldn't be guessing at this stuff. Amen. Very, very good points. Thank you very much, Jamie. I'm gonna turn it over to Chelsea in just a second, but Jamie, there was one point in there that you made that I really, really liked, and it's a good takeaway, which is the real-world-experience gut check on our metrics. How do you know if your metric is doing it right? You think about the actual scenario, you think about what a person is actually doing when they engage with your content, and whether your metric tells that story. So the example: somebody comes to your site, they download it, they go off, they use it. They do things with it in the real world. That's a bounce — a one-page visit. It's everything that your metrics intuition and the best practices that came from the e-commerce world tell you is wrong, and it's exactly what we're trying to accomplish. And so I think bringing that kind of thoughtfulness and sensitivity into our analytics discussions is the point of the studio. So thank you very much, Jamie, for showing us what it's like. This is a studio discussion, so thank you. I'm gonna hand it over to Chelsea, who is an analyst in the data studio, and who I like to think of as a curator in the studio, helping us tease out some of the best insights and stories from all our conversations with organizations. And she's gonna talk to us a little bit more about the content study that is the focal point for our conversations with Jamie and others like him. So, Chelsea.
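That gut check — a one-page visit that ends in a download is a success, not a bounce — can be encoded directly. A minimal sketch, assuming you can observe page count and a download event per visit; the labels are hypothetical, not a studio deliverable:

```python
def classify_visit(pages_viewed: int, downloaded: bool) -> str:
    """Label a visit the way the mission cares about, not the way
    default bounce-rate math does.

    A single-page visit that ends in a download is the ideal outcome
    here: the user got the data and went off to use it in the world.
    """
    if downloaded:
        return "success"
    if pages_viewed <= 1:
        return "bounce"
    return "browsing"


# Default analytics would score the first visit as a bounce,
# even though it is exactly the behavior the organization wants.
print(classify_visit(1, True))
print(classify_visit(1, False))
print(classify_visit(4, False))
```

The design point is that the success condition comes from the organization's "layman's statement of intent," and the code just restates it, rather than inheriting e-commerce defaults.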
Awesome. Thank you so much for that conversation, Jamie and Stefan. It was super, super interesting, even for me, working on the study. For this section, as we wind down, I'm gonna go into a bit of the nitty-gritty of what the content study actually is within the data innovation studio, and talk a bit about our big plans as well as one of our first milestone reports, as Mickey mentioned in the introduction to the data innovation studio. Our 2021 study is looking into the role of content in nonprofits, and to complete the study, we are creating a framework that we're referencing as the content lifecycle map. In this graphic to your right, you can see all the parts of the content lifecycle at its highest level: strategizing, creating, promoting, measuring, and refining. Most people here, if not everyone, have probably seen something like this before. At a very broad level, it covers all the different ways you can work with content and all the different roles that might go into a piece of content, from the strategizing to the promoting, the refining, and so forth. However, what we're trying to do at the data innovation studio is create a roadmap that extends this beyond the day-to-day tasks of working with content, to understanding the wealth of knowledge and experiences that go into the content lifecycle and into content data. So what we're doing, as we've already done with Jamie in that brief conversation, is going into the stories, the lived experiences involved with each step, because strategizing is more than just saying, as a team, we're gonna sit down and strategize, right?
There are experiences and gut reactions in each of these processes that shape how we implement data in them, and that we might not discuss. So our goal at the data innovation studio is to transform this map, these experiences, these stories into outputs that can be beneficial and useful for real practical implementation. We're trying to allow any member of your team or your organization to make the most sense of the various variables and dimensions of this lifecycle that are unique to your work. So that's a lot of words. That's a lot for one study. So how do we utilize this content map? I'm gonna go into a bit of a walkthrough of what we've been working on and what this milestone report is actually going to look like next. So, turning research into insights: let's start with this image on the left. This is an illustration of our current ideas for the content map. What is going on here? We have lots of squiggly lines, we have a lot of colors, and what's actually happening is shown to your right. I think I said right earlier; I meant left. So what is happening to our right with this triangle? What we're trying to understand is each of the phases mentioned — strategizing, creating, promoting, measuring, and refining — the different data types that exist within each of these stages, and then connecting those data types to resources and projects that your organization can actually implement and understand in your work. So we're taking these stories, and we're trying to understand their organization: the resources, the data types, the projects, everything that goes into one of those stories, and creating a repository of really useful knowledge and experiences to share.
From there, we can understand how a data type corresponds to certain actions that you may want your organization to focus on. As mentioned, the first milestone report we're publishing is related to website engagement data, so this is just an example for one data type. We can see in this image to your right that there are certain actions related to website engagement that you want to see, like form submissions and downloads. There are so many variables that go into whether a user is actually going to take that action: the context, their entry point into the website, the organization of the website itself, the user experience of the website. And those little beige dots you see on the edge are all the different pathways by which a user might end up at that action. So this is just one visualization of the logic of the methodology we're using to understand that big, messy, colorful content map from the previous slide. Now we're trying to take those big ideas, that repository of knowledge, and create these stories, these pathways, that are more relevant to understanding how the data is flowing on your website. So a bit more about our milestone report. Our first report is titled "Web Analytics Are Tired and True," and it's trying to understand website engagement data, presenting you with directions to navigate this map — taking that big repository of information and giving you the directions. From PTKO's experience, and from talking to members and people in our community, we understand that a map is really only useful if you have the directions to go alongside it. And that's what these milestone outputs are for.
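The pathway idea above — entry points leading (or not) to actions like downloads and form submissions — maps naturally onto a small aggregation over session data. A hypothetical sketch with made-up sessions; the field names and entry-point labels are assumptions, not the studio's actual schema:

```python
from collections import defaultdict

# Actions the organization actually cares about, per the lifecycle map.
TARGET_ACTIONS = {"download", "form_submission"}


def action_rate_by_entry(sessions):
    """For each entry point (search, social, direct, ...), compute the
    share of sessions that ended in a target action."""
    totals = defaultdict(int)
    converted = defaultdict(int)
    for s in sessions:
        totals[s["entry"]] += 1
        if s["action"] in TARGET_ACTIONS:
            converted[s["entry"]] += 1
    return {entry: converted[entry] / totals[entry] for entry in totals}


sessions = [
    {"entry": "search", "action": "download"},
    {"entry": "search", "action": None},
    {"entry": "social", "action": "form_submission"},
    {"entry": "social", "action": None},
    {"entry": "social", "action": None},
]
print(action_rate_by_entry(sessions))
```

Breaking the rate out by entry point is what turns "a user took an action" into a direction on the map: it shows which pathways are actually delivering the actions, instead of one enterprise-wide number.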
They're ensuring that transparency, but they're also providing detailed, personalized recommendations for how you can best use all the information we're collecting. So this report is going to include types of engagement data, research about the impact of content, interviews and case studies from peers and our data innovation studio members, as well as role-specific guidance on how to use content data. That's where the directions come in. Of all the things included in the report, I know we at the studio are really interested in the interviews and case studies. That is really the bulk of our research: collaborating with and gathering information from our members to create things that are relevant and useful. There are so many different pathways and directions to understanding content in your organization, and our content lifecycle map framework helps organizations figure out the different roles in your organization and how they can support content impact, even if they're not the most obvious person involved with data. So our map is trying to provide specific guidance on how to use content data and provide clarity to your team. We wanna create a framework that can inspire discussion — that can inspire, as Jamie and Stefan mentioned, the why behind what we are collecting — and then create practical steps to show a clear pathway of where we can go next with our content data. Our data innovation studio is really in the process right now of collecting this information, working out the study, and building out this framework. And there is so much room for so many more conversations about what we can add to the study, what direction we can go in, and what milestone outputs are actually useful to the mission-driven sector. So with that, I'm gonna pass it back to Mickey to round us off for today's webinar. Thanks so much, Chelsea, for that walkthrough. That was super great.
Before we wrap up and move on to Q&A, we just wanted to pose some questions to you all around the central theme: of the five stages that Chelsea talked about in this content lifecycle map, which one poses the biggest challenge for your organization? Moving through these questions, we have: where in the lifecycle might you do something differently? Whose responsibilities and what tasks are involved? What specific decisions would be made differently? And what data would you like to have at your disposal to guide them? Working through these questions can help identify potential problem areas and opportunities for improvement to increase the performance of your content. And these exact questions are the type that we've worked with our studio members to solve as a community — to discuss, build answers, and bring solutions back to your teams and organizations. Great, thank you very much, Chelsea and Mickey, thanks for walking us through that. That is the end of our planned content, so I'd like to open it up to some Q&A and field any questions about the conversation. I know Jamie might have to jump right at the end of the hour; I think some of the PTKO team can linger a little bit. Jamie, thank you so much for taking the time to join us today and share your perspectives on all this, and of course, beyond that, for being a part of the studio. I think there's one question already — thank you very much, Emily. How does a small team effectively utilize and put into place what we're learning in the studio? So, the reports are going to be a very tangible, tractable example. We are going to try to distill, from all the conversations that we're having, really practically implementable ideas: here's a list of metrics, and here's the logic that you need to decide which of those are relevant and worth your time to pay attention to, put in place, whatever needs to be done.
So I think those are some of the nuts-and-bolts things, but I really cannot emphasize enough the value of the conversations that we're having in the studio. We truly want this to feel like a studio. You can book time with us at any point and just come in and talk through what you're facing, talk through the kinds of decisions you're trying to make, and let's work together, think aloud, and hear from our team what we've been hearing — doing that collaboratively with us as the studio team at Parsons TKO, and then also member to member. Those member-to-member conversations are really important to us as well. So creating a space for those conversations to happen is a big part of what we can provide. I'd love to hear any other questions from the group. Anything for Jamie first in particular, before he pumpkins? If you come up with questions after I pumpkin — I do have to leave at the hour — feel free to, Stefan, you can collect them and send them to me, and we'll find a way to loop them back out. Absolutely. Over time. Anything else from the community here? This was fantastic, and Jamie, it was wonderful to hear from you as a former think tanker. It's always that question of: is the research reaching the people who are then gonna make the changes? So it was really interesting to hear your thoughts on that, and thank you for sharing that. You're welcome. Kelsey, it's wonderful to see you both. It's great to hear about that. I think one thing I would add, a battle every think tanker needs to fight, is how do we get comms to be part of the formation of the work and not just the distribution of the work. And yeah, we'll be fighting that all our lives. I think that's so important, and it's also, in my think tank, how do you put comms at the forefront and not at the back end of "I've got this amazing PDF file that I wanna get out, put it out."
But to bring comms into the forethought of: what are you gonna produce, for whom, and why? And let's figure out what format that should be. I think that's so important, and it's a real challenge in think tanks, because you've got all the academics who just wanna write their 10-page report, put it in a PDF, and be done. Yeah, you know what I think matters is finding the lifers — so it might be your leadership or whoever — because it's easy to — when I say easy, it's not easy — but one can write a grant, execute it, and get the page views or whatever you said you were gonna get. But when you talk to the lifers, you can show them: look at how the metrics changed over the course of this work, and let's talk at the beginning about how page views don't matter because we're doing everything on social media, I don't know. And show them that, hey, you're not gonna have a long life with this funder if you're not being strategic up front and educating them on what the outcomes actually are. Right, such a wonderful point. And I think it harkens back a little bit to what you were saying, Jamie, about brand data, another one of our upcoming studies. What do comms teams in particular know about the world? A big shift right now is that comms teams are how we get the message out, but they're also very well positioned to listen. So how can comms become a resource, a service provider internally for audience insights? I think the more we exercise that muscle, the stronger it gets as comms teams, and the more authoritative comms leaders can be in their organizations. And so it's not about governance and nagging; it's truly about, this is where the knowledge is in the organization about what's happening right now. So it is valuable to have those voices in the room at the very beginning of a new research undertaking. Thank you for having me, folks. Thank you very much.
Sorry I have to jump right at the hour, but please do feel free to reach out afterward. I'm not hard to find, and Stefan can share my contact info if needed. So thank you so much. Take care, glad to be a part of this. Thanks. Thanks, Jamie. All right, bye-bye. And thank you, everyone else who attended. We are going to wrap up unless there are any immediate final questions, but we really appreciate everyone's time and attention, and we're excited to roll out the studio to the community. So please follow us, check us out on LinkedIn — we've got some of our credentials there — but again, we're also not very hard to find, and we're really excited to get more and more people involved in the studio. We'll keep this community apprised as the studio continues to grow. Take care.