Alrighty, awesome. Well, thank you for coming to our birds of a feather, the first one we've ever done. We titled this "What's Interesting About Open Source Software Developer Communities to You," and we intend for this, as the title suggests, to be a conversation about what you consider to be important in open source community data visualization and, more pointedly, what you're currently thinking about. We'll be providing a couple of leading questions that we are currently working on to kind of set the stage, but any conversation about what you're currently working on is absolutely welcome, and we're excited about it. Next. So this is just a formalization of what I just said. We wanna work on our incubating thoughts all together, get a bunch of brains in a room, and talk about what we're doing. And then maybe, if it's interesting, discuss patterns and anti-patterns of community analysis: stuff that you've noted in your experience that definitely works, signal in the noise, and stuff that misrepresents what you're trying to do or that you think is ineffective. A little bit of background about Cali and me before we get into our own conversation: we are part of Red Hat's open source program office, on the community data team. My name is James Kunstle. And I'm Cali Dolfi, and I'm a data scientist in the open source program office at Red Hat. For the last year or two, we've been working on a project focused on visualization and analysis of community data. In the last nine months, we've really bootstrapped in the direction of a formal project with three components, all under the project heading of OSS Aspen. The mostly publicly facing component of that is a data visualization web app called 8Knot. And we do a lot of open source research with another subcomponent of OSS Aspen called Repel.
The forward-facing application, 8Knot, uses a database from the CHAOSS project called Augur, which creates a relational PostgreSQL database of developer events: community data like commit logs, issues, pull requests, and their interrelationships, with a lot of meta-analysis of those data events built in. Yeah. Yeah, we can show that afterwards, or during; come join. After we finish our introduction, the BoF is just gonna be a conversation about this, so come join. Yeah, welcome in. A little bit of a clear-cut roadmap: we started using Augur as our data set in, like, 2022. It's an awesome database and data project. It gives us really convenient access to a lot of powerful data that we've been using for research and visualization. 8Knot is an analysis platform that right now is mostly visualization-centric, but we're building in network science tooling, and it's Python all the way down, so it's gonna be focused on bigger computation workloads as time goes on. We've covered this for the most part a moment ago, but just to reiterate: 8Knot is the open dashboard, Repel is the open research that feeds into 8Knot for public visualization and consumption. So to move directly into the BoF component, the real meat of what we're trying to accomplish here: we have some stuff that we've been consistently thinking about, questions about open source analysis that require a bit more delicacy or that are difficult to start with beyond time series metrics. We have, like, eight questions that we've been thinking about internally that we would love to share and get your thoughts on, but this is first and foremost a birds of a feather meeting.
So, just setting the stage with the stuff that we've been thinking about, we've been focusing on: project risk; affiliations, meaning what companies and individuals are interacting with a community at the contributor level, via GitHub data; anti-patterns that we're trying to avoid in our analysis, so we don't publicize a pipeline that doesn't make sense under an ethical or scientifically robust paradigm; sustainability, with respect to long-term project and ecosystem health; and project potential, identifying nascent, up-and-coming projects. There's some awesome work in Red Hat, in an emerging technologies group, that looks at that problem. And last on the list, but certainly not lowest in precedence, is anything else that the members of the community here find especially interesting; we would love to talk about that as well. What do people wanna talk about? This is when everyone gets involved. For each of these sections we have other leading questions, and pretty much from here on, what we can do is pick one of them and just start openly talking. This can range between, like, what is the actual data we're trying to look at here? How are we trying to analyze it? What do we care about in different strategies? Like I said, this is all gonna be an open conversation, so even with one person, we can pick one and get started. All right, thanks for asking the question. Yeah. So this is something that I've personally been really interested in: trying to figure out, for different communities, what do people view as the biggest risk signals? Whether it be some type of unhealthy behavior among maintainers, and is that going to be a really big sign of a potential issue within the community? Or is it commit activity?
Is it, like, event attendance, if the number of events around this community has started to go down? We're just trying to talk about where people see risk in their communities. Yeah. And I feel like this kind of goes into something that came up during... oh yeah, I probably should, just for the sake of the recording. This actually came up during a discussion at CHAOSScon: sustainability is the long-term trajectory, so risk could be something more like a downturn over time, or it could be a specific stop point. Something that came up was the idea of changes in licensing, which can stop an entire project in its tracks and change its entire direction. So risk could be seen as the inverse of sustainability, but it could also be very specific spikes that cause dramatic change within the community. Yeah. So, guys, I kind of jumped into the middle of things, and I understand the general purpose of the CHAOSS project. But we just did a talk on the map of science, and I actually take a different view of users. There is a much broader view of data mining and open source projects, right? So, correct me if I'm wrong: I understand that you guys are basically focused on open source for the sake of understanding how open source works, as open source maintainers and so forth. So that's the narrow case. The wide case is: what do you use open source for in general, right? From an open source science perspective, I'd like to find, for my open source science project, all the open source projects that serve science. And there the sets of questions are, for instance: if you want to pick a project for science, you want projects which have contributors from multiple organizations, right? For instance, multiple universities.
I want to ask a question: do you have collaborators from multiple universities? Do you have papers? So I want to have rich metadata associated with open source projects. In addition to open source software, I want to have data sets, machine learning models, things like this. From what I understand, the Linux Foundation is building something like this for their own use, and then you can overlay individual use cases. So I wonder if you guys consider this database general-purpose, right? For instance, how can someone ask questions about how this open source serves science? Can you enrich it, join it with additional data sets, like, you know, open source mentions in scientific papers, things like this? So can you talk a little bit about the plans for accessibility and so forth? Yeah, we can talk a little bit about that. I mean, Augur specifically, whenever it comes to the data that goes in there, can be used to analyze any style of open source project. It actually has some tables that are specific to academic projects. But it also has anything under the sun that you could possibly want around repositories. And so I feel like you can start to break down the question of: okay, there's a difference in how we should analyze projects that are viewed as more corporate versus academia-style projects. Do risk and these things look different between those two sides? The data, I would say, is the same, but how you'd analyze it is probably gonna be really different. Because you might care about what companies are involved, but it might matter a lot more, like, okay, what universities are involved?
What's the research around it, beyond just what's going on in the individual project? Yeah, I'm gonna give you an immediate example. In open source science we're concerned with what we call abandonware: graduate students write a bunch of scripts and then they graduate, or professors move jobs. This is so typical, right? If researchers write software, it doesn't have documentation, it doesn't have tests, it doesn't follow best practices, and then it's abandoned, which is very tragic. So ideally, you want to monitor: when they start committing code without tests or documentation, you want to warn them and you want to teach them, right? This is a golden age of programming; you can actually tell them, go to ChatGPT and ask it to add tests to your code, ask it to document your code. I mean, you can do amazing things now. So I personally would like to figure out: can we monitor this, can we put a GitHub hook on it? And when they start committing bad code, tell them to do something, point them at things, and obviously, if they're about to abandon it, tell them to call for community support. I think there's a whole bunch of fun things you can do to prevent abandonware from happening, right? So I'm curious, now that I hear about it, what you can do. Yeah, and another question, and we can go here as well: going back to the overall question of risk, what are the differences for some of the people in the room who might be participating in more academia-style projects? What does risk look like there, versus what does risk look like in different projects: your Kubernetes, your projects that are deeply tied to upstream products?
And so I just wanted to see how we can tie things back to some of the overall questions. What I was hearing from you was that you were identifying a couple of axes on which there is an overlap with the risk that you might see in a project you're especially interested in from a scientific perspective, and some of those are less immediately tangible, like code test quality. Looking from 30,000 feet, you can maybe look at coverage, but quality of tests can be difficult to measure. But it definitely speaks to me that there is an intersection there, which is a measurable indicator of risk in some respect. Well, and also to add to what they're saying, the whole point of the CHAOSS project is to devise atomic-level metrics that you can string together to answer whatever question you want. You've got a hypothesis; what data am I looking for to prove that hypothesis? And really, Aspen and 8Knot and Augur, all of these tools are following that line of thinking, because every community is different. Every ecosystem has its own idiosyncrasies. So the short answer to your question is yes, you would be able to use this to do open science related things, because that's the whole point. These are trying to be as community-neutral as possible. And I'm gonna basically stand up and hand this out. Yeah, I was gonna give a little update for people who came in late. We did a very basic introduction of the specific work that we do, and this BoF is just to literally talk about different topics and see what people think. These are the questions we have, about six different groups of questions.
And so, yeah, this is a conversation. There's a microphone and technically we're the ones up here, but this is for everyone to talk. If people have more things they wanna talk about for risk, we can go there, or we can move on. This actually might be a good transition point to see if anybody else has anything specific around risk they wanna talk about, or we can move on to some of the next topics. I'm actually gonna set a timer so we can get an idea of how we're moving, since we have good discussion going on. And just raise your hand and I'll bring you the mic. I was thinking that what we could do is just go through the questions we have, not to address them in the moment, but just so that you see the full breadth of the things that are on our minds. And like I said earlier, for those that came in after I said it, that final column of anything else that you're currently working on and excited about, we'd love to work on that all together. So there are a couple of categories. Risk, which was on the board when you came in. Affiliation: one of the things that we were looking at is how can we, for the general case, look at affiliation inside of a project. So, looking at the emails of all the contributor events in a GitHub repository, commits, issues, and so on. Even if you can't resolve an email to an action taken, you can just have a username and you know all the emails they've used. How fine-grained is useful affiliation data? Is the presence or absence of individual groups especially important at a magnitude level? Sustainability is, I think, the third of the six topics. Could we consider popularity as part of sustainability? Sophia Vargas had an excellent talk at CHAOSScon about sustainability and the long-term staying power of a project.
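An editorial aside for readers: the email-based affiliation idea described above can be sketched in a few lines of Python. Everything here is a hypothetical illustration; the sample events, the domain-to-company map, and the handling of GitHub's masked noreply addresses are assumptions, not part of any pipeline the speakers describe.

```python
from collections import Counter

# Hypothetical map from email domains to affiliations; a real pipeline
# would maintain (and regularly revisit) a much larger table.
DOMAIN_AFFILIATIONS = {
    "redhat.com": "Red Hat",
    "google.com": "Google",
}

def infer_affiliation(email: str) -> str:
    """Guess an affiliation from a contributor email's domain."""
    domain = email.rsplit("@", 1)[-1].lower()
    if domain.endswith("users.noreply.github.com"):
        return "Unknown"  # GitHub-masked addresses carry no signal
    return DOMAIN_AFFILIATIONS.get(domain, "Unaffiliated/Unknown")

def affiliation_breakdown(events):
    """Count contributor events per inferred affiliation."""
    return Counter(infer_affiliation(e["email"]) for e in events)

events = [
    {"email": "alice@redhat.com"},
    {"email": "bob@google.com"},
    {"email": "123+carol@users.noreply.github.com"},
    {"email": "dave@example.org"},
]
print(affiliation_breakdown(events))
```

This is also where the "how fine-grained is useful?" question bites: domain inference says nothing about contractors, personal addresses, or people who use several emails, so counts like these are at best a magnitude-level signal.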
Could popularity be a part of that? Is there a way to measure that meaningfully, in the general case, without relying on more ephemeral measurements like downloads, which aren't always a great indicator and are often tricky? Anti-patterns: what we might have tried to do that has not worked, gotchas in the world of trying to get specific data points. And then project potential, I think, is the final one. What indicates that a project is up-and-coming in some respect? If you were to see a project in its infancy, what might tell you that it's going to be something major? Is it an influencer effect? Is it how it exists within its ecosystem? These are just the food-for-thought questions. So if anything spoke to anybody and really energized you, we'd love to attach to that and think about it together, or if you have something that you've been working on that you'd love to share. And just as a note, I'm going to start a five-minute timer, and at the end of the five minutes we can decide as a group: okay, do we want to move on, or do we want to stay on this topic to keep conversation going? So, as a community manager, I'm not so much concerned about project metrics; it's more about people metrics. I don't care about lines of code in a repository, but I do care about pull requests and who's making them and who's reviewing them. I care about who's connected to whom and who's connected to the most people. We saw the graphs earlier with the projects and the different ways of displaying that; being able to see that for my community, and see who's important and who's up-and-coming, who's becoming more and more important. But more importantly, it's what do I do with that data once I have it? How do I turn those numbers into an action item for me, or for somebody in my community, to make my community better? So a lot of this, if you just take "project" and substitute it with "community," really speaks to me.
I have a follow-up question then: if the people are what you're really concerned about, taking data aside, what are your questions about the people? If you could have any question answered. I have a lot of questions to answer. But the big question is: today, who do I need to get in touch with, and what do I need to talk to them about? Awesome. So, going off of what you've said: whenever it comes to pull requests, I think about, okay, who created the pull requests and who are the maintainers involved with them, and trying to see those activity levels. Most people who've talked to me this week know that my brain keeps going towards maintainers. So if we're seeing those activities: who are the contributors, who are the people opening them, and is it dispersed, or is it only a few specific people? And then looking at that from a maintainer standpoint: is only one person having to carry all the weight, and do I need to go talk to them and figure out how they're feeling? Somebody said this at CHAOSScon: okay, these are the contributors that we're seeing often, and there's a maintainer that's being really heavily depended on; how can we connect those two, maybe transition some of those contributors to the maintainer side, and have those people communicate? Is there a way that data could work? I'm trying to think of a framework where this would work in an anonymous sense, because one of the things that we're committed to, at least for a publicly available dashboard, is that we're not gonna tell you individual GitHub IDs in the first place.
So is there a way that we can, from your perspective, localize that kind of activity to a group without naming names? Would that kind of thing be useful? Right. Yeah. Does anybody else have some thoughts around this? But it would still work for my dashboard. Yeah. Let's see, anybody else? Yeah. Does it matter how big the project is in order for you to be able to draw some of these metrics out? Does it have to meet a certain threshold? If it's only, like, two people working on a project, it could be healthy, it could be dead. It's a really good question, and it's all context-dependent, I think. The consumer of the project, or the consumer of the visualization or analysis, if they're the person working on the project, will know to what extent some behavioral pattern feels good or doesn't feel good. So for the consumer of the visualization who's also in the community, that's pretty obvious. Looking from the third-party perspective, it can be difficult to know what the warning bells are for, like, oh, this does not look like it's adhering to what I would imagine to be healthy. But from a data perspective, the tool we have collects the data on any git-shaped repository, so it's not limited in the sense that a repo could be too small. Okay. On top of that, I would say there are two axes I always look at if I'm looking at something from a third party where I don't know that much about the community: where they are on the maturity cycle, because that's gonna change how I view some of the metrics, as well as age and size. Just because the project has lower activity, it doesn't mean that it's necessarily unhealthy. It's just more about looking at the overall size.
If you look at a graph that shows overall contributor growth and you start to see that the active contributors went from, I don't know, 50 per month down to 10, that's very different from a consistent 10 or 12 that has stayed in that space over time. So I think that's important on those types of things. You can look at different sizes from these views; you just have to take the right lens on it. And then if you're personally knowledgeable about the community, something that we've started to see is that when people look at these graphs, there are different anomalies where they'll point and be like, oh, I know this happened. There was an event that happened right here, a major community decision, and now they can see from a higher point of view how that impacted activity overall in the community. I'm gonna go to Michael. Yeah, I just wanna follow up on your question a little bit, and think about metrics that we can actually measure, right? So when you're trying to identify these influencers in the group, I assume they're probably communicating outside of GitHub, like on Slack or something. Yeah. Okay, but let's assume that they were doing all of their communication in issues before PRs were opened, or they're communicating in PRs, so all that information is actually on GitHub. Is that in the Augur database at all? The whole conversations that people are having? Yes, which is something I find super interesting: you can start to see those conversations, the messages on issues and pull requests, so you can see how the interactions go. Yeah, and have you started to do any analysis on that data set? Because that seems really rich. No, but it's really early in the roadmap. I would say that's something I would expect to start seeing within the next three to six months. Okay, cool.
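An editorial aside: the "50 per month down to 10" versus "a steady 10 or 12" distinction above is easy to make concrete. This is a minimal sketch, assuming events are simple records with an ISO date and a contributor name; the 50% drop threshold is an arbitrary illustration, not a recommendation from the session.

```python
from collections import defaultdict

def monthly_active_contributors(events):
    """Map 'YYYY-MM' -> number of distinct contributors active that month."""
    seen = defaultdict(set)
    for e in events:
        seen[e["date"][:7]].add(e["contributor"])
    return {month: len(people) for month, people in sorted(seen.items())}

def decline_flag(monthly_counts, drop_ratio=0.5):
    """Flag if the latest month falls below drop_ratio of the historical mean.

    A steady small project (say, 10 contributors every month) is never
    flagged; a fall from ~50 to 10 is.
    """
    counts = list(monthly_counts.values())
    if len(counts) < 2:
        return False
    history, latest = counts[:-1], counts[-1]
    return latest < drop_ratio * (sum(history) / len(history))

events = [
    {"contributor": "alice", "date": "2024-01-05"},
    {"contributor": "bob", "date": "2024-01-20"},
    {"contributor": "alice", "date": "2024-02-11"},
]
print(monthly_active_contributors(events))  # → {'2024-01': 2, '2024-02': 1}
```

The point of the relative threshold is exactly the speakers' point: the raw count means little without the project's own history as context.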
One of the things that Augur does, which is really nice, is that as part of ingestion and organization of the data, Augur has a lot of tooling for message analysis. So some of those deeper insights are going to be really interesting to work with. And there's the relational database structure: you can see not just how these people are interacting on the issues; you can identify those same people when they're talking on a PR, when they're opening PRs versus committing. A lot of times you're not able to identify those connections in unstructured data, but the relational database makes those types of analysis possible in a realistic way without having to do mountains of pre-processing. Hey, so I've been hearing a lot about communities and activities, and I wonder, how do you define an active community member? Is it someone that pushes code? Is it someone that responds to issues? How do you all define that? It's a good question. I came over here just so that I could talk to you directly; it felt awkward to talk from over there. We have our own ideas of how to do it, and I wanna make sure that everybody in the group has an opportunity to speak to it. But from our perspective, everybody who engages with the process of developing software is a contributor. So the people who submit issues, the people who write back on those issues, review pull requests, review documentation, help write codes of conduct, help create artistic resources: they're all part of the equation. It is not an uncomplicated question, from the pure code perspective, how much you value the contribution of an issue. You can say it has a non-zero value, but I wouldn't intuitively say that it has the same intrinsic value as a pull request. I would open that up to the group, for sure.
Yeah, because I think there's a very clear differentiation between a contributor and an active community member. A contributor could be a one-time contributor: they had a problem, they submit a pull request, and they leave and never come back. They're a contributor, but they're not an active member. From the literature... sorry, please, go ahead. I was just gonna say, I think comments on a pull request can be just as important as the pull request. Maintainers get less and less time to do actual pull requests, but they invest a lot in review; it'd be easier for them to just write the pull request themselves half the time, right? So I don't think we should undervalue the comments. Totally true. Yeah, so for me, I do distinguish between contributors and active members. For me, an active member is anybody who has an interaction with anybody else in the community, whether that's a pull request or just saying hi on Slack. That's activity in the community, so they're an active member. I typically put a one-year window on that, so if you've not done any of that in a year, then you're not active in the community anymore. Yeah, and to that point, I think it's also important to not only track the kinds of activity, like codes of conduct and code commits within the GitHub repo space, but also monitor the ecosystem around it. Because there might be members who have binding votes in that community who may not be pushing any code or directly committing, but in terms of the project's sustenance, in terms of funding for the project, it's important to know who the key influential players are who have a say in that community's future or its sustainable health. Yeah.
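An editorial aside: the one-year activity window just described is a nice, crisp operational definition, so here is a minimal sketch of it. The member names, dates, and the idea of keying on "last interaction of any kind" are illustrative assumptions; only the 365-day window comes from the speaker.

```python
from datetime import date, timedelta

ACTIVITY_WINDOW = timedelta(days=365)  # the one-year window mentioned above

def is_active_member(last_interaction: date, today: date) -> bool:
    """Any interaction (code or conversation) within the window counts."""
    return today - last_interaction <= ACTIVITY_WINDOW

def active_members(last_seen: dict, today: date):
    """last_seen maps member name -> date of their most recent interaction."""
    return sorted(m for m, d in last_seen.items() if is_active_member(d, today))

today = date(2024, 6, 1)
last_seen = {
    "alice": date(2024, 5, 20),   # commented on a PR recently
    "bob": date(2023, 1, 15),     # silent for well over a year
    "carol": date(2023, 7, 1),    # said hi on chat last summer
}
print(active_members(last_seen, today))  # → ['alice', 'carol']
```

Note that this deliberately treats a chat greeting and a merged PR identically, matching the "any interaction" definition; the later discussion about binding votes and funding influence is exactly the kind of signal a definition like this misses.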
Yeah, I think that shows the next step: once we're starting to get comfortable with the GitHub-style data, how do we start going into those next steps of looking at data from those different communication platforms? And that's gonna be a lot more unique, community by community. To your point of how do you know which of those types of contributions are better: that's something that we are also trying to look at from a graphical approach. Not sure if you were at one of our talks, but it's closely related to what Cali and James are doing, just with a more statistical approach. When you're trying to visualize a community from a graphical perspective, you look at the edges between all these nodes, which is why we took this approach, and those edge weights can vary depending on the type of contribution. So let's say, in our view, a pull request has a higher weight compared to just commenting, or just reviewing, or just opening an issue but never actually actively contributing to the code, right? That's how we also try to add weights and values, like James was saying, for each of those types of contributions. Just shedding some more light on how we can try to differentiate those. I'm just a little interested; maybe I missed the beginning, but it seems to me like these questions, in the abstract of what you want to achieve, are a bit difficult. In your case, you had a very specific use case: hey, I want to understand who's helping people with initial contributions, because that's valuable and I want to support them, right? And I could see, in our case, we'd like to see not the people who made their first commit, but maybe their tenth commit, people who look like they may become long-term contributors, and how can we help them?
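An editorial aside: the weighted-edge idea above can be sketched without any graph library. The speakers only say that a pull request should weigh more than commenting, reviewing, or opening an issue; the specific numeric weights and sample interactions below are made-up illustrations.

```python
from collections import defaultdict

# Illustrative weights only; the relative ordering (PR highest) comes from
# the discussion, the exact numbers do not.
CONTRIBUTION_WEIGHTS = {
    "pull_request": 3.0,
    "review": 2.0,
    "comment": 1.0,
    "issue": 1.0,
}

def build_edges(interactions):
    """Accumulate weighted, undirected edges between contributors.

    Each interaction is (person_a, person_b, kind); edges are keyed by the
    sorted pair of names so direction doesn't matter.
    """
    edges = defaultdict(float)
    for a, b, kind in interactions:
        edges[tuple(sorted((a, b)))] += CONTRIBUTION_WEIGHTS.get(kind, 0.5)
    return dict(edges)

interactions = [
    ("alice", "bob", "pull_request"),  # alice opened a PR that bob maintains
    ("alice", "bob", "comment"),
    ("carol", "bob", "issue"),
]
print(build_edges(interactions))  # → {('alice', 'bob'): 4.0, ('bob', 'carol'): 1.0}
```

An edge map like this is the natural input for the "who's connected to whom, who's becoming important" questions raised earlier, and the gaming concern raised next in the discussion applies directly to whatever weights you pick.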
So, understanding that versus... I'm thinking as we're talking, but it gets a little dangerous to start getting into: well, we've taken a rank of all the contributions, and this person's ranked high, this person's ranked low. That's gonna lead to people gaming the system, and what do you get out of that? My question was back to: what's the goal, right? I guess that's why we're having this conversation, because honestly, that's the whole reason we wanted to do this: everyone has different perspectives and goals and things they're coming to visualizations from. We wanted to hear what different people's questions are and, as you're saying, what the concerns are about measuring these things and why we should or shouldn't do it. There's a lot of care that needs to go into putting out visualizations, because they can influence action. Yeah, and going back to that, I think that's a great point. One of the most common, let's say, bad practices that I see is jumping directly to the goals without thinking about the problem you're trying to solve within your community. A lot of the time we jump into the solution without initially considering: what's the thing that I need to solve first? And then you can find the tools, and tools can be events, or sending a message to someone, or whatever solves that problem statement that you're finding in your community. And then, speaking with my Red Hat manager hat on: the overall goal for why we started this project in the first place was to answer broader questions about sustainability and risk for ourselves, because if there's going to be a trouble spot in a project that we are directly working with, we would like to know about it. And for our customers and our partners, that is the number one question we get.
I wanna know: if I'm gonna be doing this whole open source wacky community thing, I wanna know how we can tell if this project that we wanna work with is gonna be around tomorrow. Literally, I've had that question three or four times. So I appreciate what you're doing, and I think we could get there. But right now our goal is more: how do we make sure that a community is aligning with business interests? In the past we've done this with our guts as community managers, and now we wanna be able to quantify that quickly and say, from a business standpoint, this feels like a solid community to get involved with, and go from there. I just wanna give the perspective of where we're coming from now; it is certainly not the only direction we will take, but that's where we're coming from now. Yeah, so I'm Tim, I was a little late so I may have missed this, and I work with Alexi on open source science. What I could see here as a use case, as we build out the map of science that we've been brainstorming: I could see a simple traffic-light system where you have the open source projects in, let's say, cancer research, right? And as soon as something goes into yellow, we get a little ping that says, oh, there's something happening over there. There could be hundreds or thousands of projects in such a region of science, right? And that could be interesting for funders, for other organizations, for neighboring projects up and downstream, and all that stuff. So I could see something like that. One difficult thing, if I can directly comment on that: from a metrics perspective, and from a whole-ecosystem perspective, the yellowness of a project is almost impossible to quantify in a lot of ways. So just green and red, then. But yeah. I guess yellow could just be: okay, what are we starting to see, identified as, okay, should we pay attention?
It doesn't necessarily mean something is wrong. That's something I've always looked for: what is a signal worth looking into more? Because there are a thousand and one things we all have to do and look into all the time. It might not be that there is a problem, but it's a sign that things are a little off from what we'd normally want to see, so let's look into it. And if you spend five minutes looking and feel comfortable with what's going on, you move on. It might just be something to help manage time.

Oh yeah. And I feel like with academia it's a little easier, because you're seeing a very similar style of project. Whereas if you try to use the same standards to compare an academic project to a large-scale project with multiple companies deeply involved, you're comparing things that are completely different.

Yeah. I think Brian's description helps a lot: look at it through a particular lens. If the question is which projects we want to invest in, that's a much different lens than how we help projects help themselves, or how we find the people who can make a project more successful. There's the risk of the project going bad, and there are other risks too; depending on the lens, the question changes, and taking one lens is probably going to make it much more manageable. Otherwise, as you were just saying, with an academic project you might simply conclude those aren't projects you'd want to invest in, because once the person graduates, it's over. So I don't know, but it's very interesting to pick that lens.
Just to bring it back around to community for me: I do get a ping if, say, I set a threshold of ten commits as a significant milestone. I get a ping when somebody hits their ninth commit, saying this person is close, reach out to them, help get them over the edge. Or if somebody has been active and then suddenly isn't, I get a ping saying this person isn't around anymore, you might want to reach out and see what's going on. I've had a lot of success with that one, just reaching out and saying, "Hey, I noticed you haven't been active, hope everything's going okay, let me know," and finding out they took a break and now want to come back, and they'll come back and participate again.

One of the things I'm hearing, tying a couple of thoughts together, is that there are massively different operationalizations of these metrics. Metrics, taken as stitches in a patchwork quilt of perspective, fuel different actions. From the Red Hat customer perspective, they want to know which couple of indicators speak to sustainability; from the community manager and maintainer perspective, it's how we enable our contributors to keep participating, and how we invest in longevity by being a human element. And one of the things I'm finding with metrics in this space, borrowing some thoughts from a talk Callie gives, is that this is such a complex space: there are so many signals, and they're all relatively good signals for what they're trying to accomplish, so it's difficult to narrow down to the two or three that really pinpoint your objective.
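The commit-milestone and inactivity pings described above could be sketched roughly like this. This is a hypothetical illustration only: the function name, thresholds, and data shape are assumptions, not part of any speaker's actual tooling or of 8Knot.

```python
from datetime import datetime, timedelta

MILESTONE = 10                        # commits considered a "significant milestone"
INACTIVE_AFTER = timedelta(days=60)   # silence long enough to warrant a check-in


def pings(activity, now):
    """Yield (contributor, message) alerts for near-milestone and lapsed contributors.

    `activity` maps contributor name -> list of commit timestamps.
    """
    for name, commits in activity.items():
        if len(commits) == MILESTONE - 1:
            yield name, f"{name} is one commit from the {MILESTONE}-commit milestone; reach out."
        elif commits and now - max(commits) > INACTIVE_AFTER:
            yield name, f"{name} has been inactive since {max(commits):%Y-%m-%d}; check in."


now = datetime(2024, 3, 1)
activity = {
    "ada": [datetime(2024, 1, 1)] * 9,                       # 9 commits: near milestone
    "lin": [datetime(2023, 11, 2), datetime(2023, 12, 1)],   # lapsed contributor
}
for name, msg in pings(activity, now):
    print(msg)
```

In practice the "ping" would be a chat or email notification fed from a real event store such as Augur, rather than a print statement.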
You know, one of the things we're working on is giving people a platform to answer their questions when they come up, when they have a really specific, crystallized idea of how to optimize for their particular problem. So it's been really productive to hear from people working on this that there are very specific personas in this space who can make use of the data and metrics.

Yeah, and this will be my one slight plug: if people are leaving with very specific questions about their communities, or a lens they want to look at this data through, please open an issue on OSS Aspen's 8Knot repo, which is the dashboard. We're really looking for contributions, and you'd think that always means technical ones, but in this case we really want to hear the questions people bring from completely different personas. Me sitting at my desk trying to come up with questions isn't going to provide the different lenses of analysis that people with different experiences can. It's been really awesome to have such a difference of perspectives in this conversation; everyone is looking at it differently, and that's great, because our problems are different.

I will just say we're at time, folks.

Yeah, we are at time. This has been awesome. Do we have time for...

Maybe I'll just make one point. I run a lot of open source events in the community, and often VCs and founders come, and what I wonder, because I'd like to turn the tables, is this: there's always the expectation that open source contributors should do something. If you take that measure, it reminds me of operations research: how much more productivity can we squeeze out of the workers?
The question is: is the factory fair? Do you pay well? Do you have a maternity room? So I wonder, since Michael mentioned he wants to know about activity in a project so he can act on it, can you add the palette of actions that the owners, the employers, can take? For instance, when somebody stops contributing, what are you going to do? Talk to them? Provide encouragement? Give them money, since maybe they're overworked? What is within your power? Can you encourage them, give them interesting things to work on, connect them with interesting people? What is the set of actions you can take to make the project healthy? Your measurement says the project is getting unhealthy; the question is what you can do about it. And can you include that as a data set? Because right now it's very one-sided: you can see the contributors doing things, but what are the managers, the owners, the stakeholders doing to keep them going? What is the set of actions you tried, and can you log them alongside the observable data? We tried talking to this person; we gave them money; we gave them a vacation; we sent them to a conference; we gave this project a grant. Can you align incentives with work and tell me if it's getting better?

Yeah, I would say this is where the metric and the visualization stop, in the sense that they can inform you about what's going on. A lot of the time I view metrics and visualizations as a time saver more than anything: they point out things that would be unrealistic to spot without them, and then you start to lean on the experts, the people who've been involved with community management for twenty-plus years, who've been in and out and have seen things happen.
That's when the visualizations can inform people about what's going on, and then you can lean on the knowledge you have of your own communities, and of communities in general, because a metric isn't going to tell you exactly what to do. That's how I view visualizations and metrics a lot of the time: let's inform ourselves better, but not take the human out of it, because this is community. It's the people that we're looking at.

So yeah, I think that's a good closing point. Thank you.
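The earlier suggestion of logging stakeholder interventions alongside the observable contributor data could be sketched as a simple record structure. This is a hypothetical illustration under stated assumptions: the field names and action types are invented for the sketch and are not part of 8Knot or Augur.

```python
from dataclasses import dataclass
from datetime import date


# Hypothetical record pairing an intervention with the contributor it targeted,
# so later analysis can compare activity before and after the action.
@dataclass
class Intervention:
    project: str
    contributor: str
    action: str          # e.g. "check-in message", "grant", "conference travel"
    when: date
    notes: str = ""


log: list[Intervention] = []
log.append(Intervention("cancer-sim", "ada", "check-in message", date(2024, 2, 1),
                        "noticed inactivity; asked if everything was okay"))
log.append(Intervention("cancer-sim", "ada", "conference travel", date(2024, 2, 20)))

# A minimal query: which actions has a given contributor received?
actions = [i.action for i in log if i.contributor == "ada"]
print(actions)
```

Joining a log like this against commit or issue activity from an event database is what would let you ask, as the speaker puts it, whether the incentives you tried actually made things better.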