Hello there, George. Hello, everyone. Thanks for joining. We're going to wait a few more moments for folks to continue to pile in, and then we'll get started. Hello. For those of you that have just joined us, we're going to wait a few more minutes for everyone to pile in, and then we'll get started. All right, it looks like participant entry has slowed down, so we'll go ahead and get started. Thanks, everyone, for joining us. This is the CNCF TOC meeting for today, July 18. As a quick reminder, your attendance and participation in these meetings means that you comply with the Linux Foundation's antitrust policy notice; if you're curious what that is, the URL is on the slide. I'm going to endeavor to fulfill the role of presenter in Amy's absence today. You're here, you know the meeting logistics. We have several TOC members present here today, but it doesn't look like we have everyone, and I don't think we're doing any sort of voting, so we won't need quorum. So this is going to be a discussion. We're going to be talking about sandbox annual reviews: how annual reviews are conducted, what the evaluation and process should look like when we're considering annual reviews for sandbox, and how we should vote. The reason why this is coming up is, as you all know, we have a lot of sandbox projects, and they are to submit annual reviews to the TOC at least annually. The result of that is we now have over 30; I think last time I checked we have 35 open sandbox reviews. In the past, when we had about 10 or 12, it was very easy for the TOC members to either pick up one review, engage with the project, check in with them, and then do a report, or, in bulk, sit through a TOC meeting reviewing all the content of the annual reviews and then doing a bulk approval of them. So that's kind of how we've done it in the past. Through various conversations with TOC and tag chairs and tag members, we'd like to return sandbox annual reviews to the tags.
In order to do that, we need to ensure that they're community driven, and that we can still quickly reach a position where the TOC agrees to continue the sponsorship of the projects as part of sandbox. We're still figuring out exactly what the process looks like; some of the recommendations include that when a tag completes reviewing a sandbox annual review, they could provide a comment on the PR, and any concerns that show up as part of that review go to the TOC liaisons for remediation. So those are just some of the ideas. Let me go back one slide. In the past, we had annual reviews that were executed by tags. During that time there wasn't really any guidance on what needed to happen as part of an annual review process by a tag member or by a TOC member, and in fact we don't have this clearly documented today. We do have a template that sandbox projects fill in when they submit for their annual review, but it's not necessarily consistent across all the projects that are completing the various areas. Some of them include a little bit of background and detail about what the project is; some of them just cut to the chase and start providing us with some content and some updates. So over the course of the past several months we've had a lot of conversations about project health and project reviews, and what they look like. We do have plans for doing annual reviews for incubating and graduated projects, but that conversation will happen at a much later date. So today we're really just focusing on sandbox annual reviews. So I'm going to pause, and I want to open it up to discussion a little bit first, to see if there's anything else that I'm missing in setting the stage for how we're going to potentially explore conducting over 30 sandbox annual reviews this year. Do we want to talk about mapping areas of the landscape to tags now, or do we want to move that later in the discussion today? We could probably go ahead and get started with that.
So let's figure out, let me go back a slide, how to conduct the annual review, evaluating it, and voting. The one thing I didn't include in here, and clearly I should have, is the alignment of a project to a particular tag or a domain within the landscape. Right now, the way the annual review process works is, when a project submits an annual review, they're not affixing a label for the corresponding tag that they belong to, and this is an ongoing question and activity that the TOC engages in: looking at a project, figuring out what the appropriate tag is, figuring out where they are in the landscape. So what we've done this go-around is go through and assign what we believe to be the correct tag for each project. I'm catching up on chat real quick. Josh says it might make this easier on the tags to have projects involved in the tag on an ongoing basis; yep, that's part of the design behind moving the annual reviews back to the tags, with projects presenting their latest a couple of times a year. Bob has a plus one on having the tags more involved. It was added in chat that it's a work in progress adding metadata to the CNCF landscape, so things will be associated properly and more automated. We have some rough mapping; Nikita has posted the issue, thank you, Nikita, and Richie added a strong opinion that we need to have good guidance for the tags. Agreed. Sounds like everyone's in agreement: we can move these to the tags, and we need to ensure that we have enough process in place for consistency. And some of the tags, Karina, thank you for bringing that up, have a hard time keeping up with their regular duties, and that has not yet been resolved across all the tags, so we're starting to put additional guidance in place to hopefully assist them. In the past, if I recall correctly, and tag chairs, if you've been around a lot longer than me, please chime in.
When annual reviews for sandbox projects were with the tags, the tags did receive some additional contributions or attention from projects through those regular engagements, because the projects were sent into the tags. But since they were moved out, we started to see a drop-off in a lot of that. So, catching up on chat: the tag labels in the landscape for projects have an ongoing issue associated with them, and Bob mentioned having the liaisons sit with the leads to go through a few of the reviews so they have an idea of what to look for, etc. Yep, that sounds good; the more integration with the tags and the TOC, the better. Okay. Any other comments? I was also thinking that we should probably have some kind of guiding checklist, so that it's not arbitrary across tags and it's all consistent, and I know we've talked about probably doing something similar for the due diligence process that the TOC does for incubating and graduating ones, so maybe we should do that for the annual reviews as well, so you can check those off. Yep, I agree. Let me jump back over here. So it definitely sounds like alignment of the landscape and the domains within it needs to happen to expedite this a little bit. What about projects across tags? That's an excellent question, and this is an ongoing concern. We do have several tags that don't operate in a traditional technical domain as we see laid out here; TAG Contributor Strategy and TAG Environmental Sustainability are two that really stand out as examples of this. But you can also see that we have some tags where we have a lot of projects associated with them; TAG Runtime is a really good example. It's not realistic or feasible for us to expect TAG Runtime to take on every single one of the annual reviews that are submitted here. So we also need to be able to right-size: even though a project may be associated with a particular tag's domain area, if that tag is overburdened,
we might need to shift something elsewhere, to another tag. Ricardo, thank you for the catch on OpenELB; let me make that change here. I'll have to go back and update the table on the issue; if a TOC member would like to do that, that would be lovely. So, moving back to the how. Based off of previous discussions that we've had with the tags and the TOC, and how the TOC has historically reviewed sandbox projects, we've done it at least two different ways. So let's pull together some initial discussion areas around what evaluating a sandbox project looks like from an annual review perspective. Thank you, Nikita. Looks like there's two slides of this. So, if everyone is good, I'd like to go through and discuss them, see if these make sense, and then also see if there's anything that is missing or if we need to try this a different way. One is long-term planning. A lot of sandbox projects come to us, when we accept them into the CNCF, with a roadmap or some indicator of what it is that they're doing. I think it's reasonable to expect that the sandbox project would still be sustaining and maintaining that long-term planning and roadmap that they have. Another portion of that is that it doesn't need to be highly detailed or very specific; it can be against releases, and in some cases it could just be something that's more about moving milestones. The next portion is that we're starting to see some of the adoption of those projects pick up: not necessarily that they're getting a lot of adopters, but that they're getting more attention and seeing more folks interested. And then, development is ongoing and progressive towards version releases: they're starting to think about how they're going to cut releases, they're thinking about how they're doing versioning, and one of the requirements that we have is that they are versioning their releases.
And that the project doesn't actually appear to be in maintenance or sustainment mode: we see some activity going on, and it looks like they're making incremental progress. For some projects it might be a lot of progress, and in other cases it might not be a lot at all. Does anyone have any questions or additions or suggestions to the three that I have on this slide? I tried to group them together. And then, project viability: are they experiencing some form of community growth and development associated with that? Are they starting to balance project development with contributors that are interested, individuals that are filing issues or PRs to improve the project for their particular needs? We also want to see more project governance associated with them. Some sandbox projects come in with a very minimal amount of governance, but as they see their community grow and start to develop, we should be seeing more project governance put in place, something with a little bit more rigor to point to as they experience stretches in their current governance practices. And then just general self-awareness: they understand where they're at from a maturity perspective, they understand what else they need to do if they're progressing towards incubation and how far they are from it, and they know whether or not they feel that they're ready for it. So those are the couple of areas that I've pulled out of our past discussions on evaluating sandbox projects. The intent is not necessarily to develop this as a checklist, more as a procedure around how we look at them. The problem with some of the checklists, especially with sandbox projects, is that they cross multiple domains. What may look appropriate for one security project is not necessarily going to be the same for a networking project, depending on how much attention or novelty they have and how interested contributors and adopters are in that. Any questions, comments, additions, clarifications?
Yeah, one question: what would be the guidance for a project to continue remaining in sandbox, and also for when the project needs to be archived? Is that within here, or is there any, let me check, not a checklist, but any guidance on that? That's a really good question. So we have an archive process that's already defined; it's very lightweight. My expectation, or kind of where my head is at in thinking about this, is that if a tag were to review a project during an annual review and find that they look like they're in maintenance mode, or there's a lot less activity, they don't seem healthy, so basically the project looks due for an archive or isn't receiving a lot of attention, that would be something that I would expect the tag to bring to their TOC liaisons. At that point the TOC liaisons would be responsible for initiating an archive process: checking in with the maintainers, understanding what's going on a little bit more with the tag, and then facilitating the public discussion associated with it. As for remaining in sandbox, that is a different conversation, unfortunately. We are starting to see projects reach a level of comfort with the level that they are currently sitting in, and that could be for any number of reasons. We don't actually define what an appropriate amount of time is for a project to remain in sandbox, because it varies greatly, but I'm curious what other TOC members' perspectives and opinions are on that. Looks like there's a hand raised. Yeah. Oh, I had a different point, but I completely agree with everybody on the conversation side right now. So, Richie: it's probably a good idea to codify that we don't have to have everyone growing at all times, as long as there's maintenance going on and people are happy to sit at a certain level. Making it explicit that this is actually an option maybe reduces the strain, because they don't perceive a pressure to have to progress. That's another good example and concern area.
So it sounds like, whatever it is that we decide from a process perspective, we need to ensure that there's accountability for archival and initiating that if necessary, and then making sure that there is clarity on expectations: that projects may not be leaving sandbox within a year or two years, it could be longer. So we need to make sure that that's documented as well. Chris comments that the bar should be pretty high to archive a project. Scrolling through chat. Josh, I definitely hear you around motivation of members and getting folks to do work. So, Nikita, I will turn over to you before I start answering questions in chat. I just had a very small point. Since annual reviews aren't only about checking whether a project has grown, but also about giving guidance on how they can grow, kind of providing a mechanism for them to grow into incubating projects, I was wondering if it makes sense to add, in the annual review process and questions, just asking the projects what kind of help they're looking for from the CNCF, the tags, or the community in general. And maybe it doesn't have to be a checklist, but kind of just rephrasing some of those things. Like, are they doing anything to grow their contributor base? This is one of the things that we look at when we look at incubation. And I think it should be okay if a project has not grown significantly, but if they call out what they're trying to do to make this better, probably tags can provide guidance. So it's kind of rephrasing it: not just "where do you stand right now with the state of the project," but more of "how can you grow" instead. Yep. Okay. We've got some questions in chat. Have we received annual reports for all sandbox projects? Not yet. So the current process is, it's one year from when they were accepted, so projects will submit annual reviews periodically.
I believe one of the CNCF staff, I think, Bob, correct me, has been notifying projects that they're due for their annual review. Chris: yeah, definitely, we started to do some automation here where we're basically poking projects that they should go submit their annual review, and we're basing that on metadata in the landscape that says when their latest review was. So it's kind of a new little process-slash-automation thing that we're doing that will hopefully spread and notify all maintainers that this is happening, and that they should have the right metadata for this to work. It should definitely be improved; it'll be easier to track all this, hopefully. Yep. Kathy: yes, I think for the community goals, we may also want to look to see whether there are any contributor goals or new maintainer goals. Or sometimes, you know, some projects have maintainers from only one company, so we may want to look for that too. Karina asked: if the tendency is high for many projects to stay back in sandbox, does that risk attracting contributors, since there is no guarantee that they would mature? I don't know that we've necessarily seen a lot of projects in sandbox that want to stay there forever, because it does take them quite a bit of time in some cases to move levels, so I don't know that we have enough data to be able to reason about whether or not there's any risk there. So far, from what I've seen based off of the backlog queue that the TOC has for incubation applications, they are receiving contributors and there is some progress being made, but we have not done a good job clarifying expectations for the steps to move into incubation. We have the criteria defined, but I still think that there is a gap between the sandbox application and actually being ready for incubation, a gap the tags could definitely assist a lot of our projects in fulfilling.
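As an aside for readers: the reminder automation Chris describes, poking projects whose landscape metadata shows a stale review, could look, in spirit, like this minimal sketch. The dictionary fields here (`maturity`, `accepted`, `annual_review_date`) are placeholders chosen for illustration, not the actual landscape schema or the CNCF's tooling.

```python
from datetime import date, timedelta

# Hypothetical review interval: one year since the last review
# (or since acceptance, if no review has happened yet).
REVIEW_INTERVAL = timedelta(days=365)

def reviews_due(projects, today):
    """Return names of sandbox projects due for an annual review."""
    due = []
    for p in projects:
        # Only sandbox projects submit annual reviews to the TOC.
        if p.get("maturity") != "sandbox":
            continue
        # Fall back to the acceptance date if no review has happened yet.
        last = p.get("annual_review_date") or p["accepted"]
        if today - last > REVIEW_INTERVAL:
            due.append(p["name"])
    return due

projects = [
    {"name": "alpha", "maturity": "sandbox",
     "accepted": date(2022, 6, 1), "annual_review_date": None},
    {"name": "beta", "maturity": "sandbox",
     "accepted": date(2021, 3, 1), "annual_review_date": date(2023, 5, 1)},
]

print(reviews_due(projects, today=date(2023, 7, 18)))  # ['alpha']
```

The point of keeping this check metadata-driven is exactly what Chris mentions: it only works if maintainers keep the landscape entries accurate.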
Bob had replied around Josh's concern about motivation, about having enough contributors within the tags to take on annual reviews; definitely, sharing some of the responsibility here would help. But Josh, would you like to add anything else? Yeah, I figured I can come off mute. I just wanted to mention that I think this is a great strategy, and I would love for my tag to take on more here. But I have to be honest and just bring this up with you all: I don't know who's going to do the work. I'm not even officially a chair, and I'm working hard to get that, you know, set. But yeah, I could use some help with, like, how do we maybe assign them work; I think a lot of these contributors don't even have the time to really do this. So I'm a little concerned. I don't have an answer; I just wanted to share that. No, it's a valid concern. And actually, that skips ahead a few slides, so that's actually something that we should be bringing up: the sandbox reviews are not to have the same intensity that we do for incubation or graduation, because these are still very early projects; they're still experimental, they're still trying to find their footing in the ecosystem. So ideally these are pretty lightweight. There are some circumstances I can foresee where a tag starts looking at a particular project, starts digging in, and sees that some things seem to be missing, or that they're not making as much progress as one would expect given the amount of attention that particular subdomain area is receiving. Those, in my mind, would be conversations with TOC liaisons.
There is definitely an opportunity here for us to leverage annual reviews to increase the contributors and members within the tag community and ecosystem, and to better distribute this work. Like, you can see that if a project has gone through an annual review and had a positive experience and engagement, they might return back to the tag to pay it forward, and ideally that's what we'd like to start seeing with some of our projects. Ricardo added that maybe using something like Sessionize would be excellent for managing a lot of this. That's certainly something that we can talk to the foundation staff about, to see if that works. Right now, they're through PRs. With the template that we have, it does give us some information, and I know Sessionize does have some questions from a reviewing perspective that we can leverage. But we need to make sure that we have consistent information, and we've also talked in the past about automating some of the information that goes into a sandbox annual review, understanding what the DevStats associated with it are. So that's something that we can definitely look at. I would prefer that we do a more manual process first, before we look to automate it with other tooling, because it's easier to iterate on manual activities. Let's see, catching up. Okay. So we have a lot of things that we need to figure out still. Josh, you came off mute. Yeah, I just was going to share: I love the idea of giving something concrete, what you said before about having a project being able to tell people, hey, this is how you contribute, here's how you review it. And I was just going to say, we've started kind of doing this on my tag; maybe we'll create an issue for each thing and, in that issue, guide the reviewer and the project along. Yeah. So then the next thing is voting. In the past, the TOC has done two different forms of voting on sandbox annual reviews. The first being: TOC members are on a call.
They pull up all the annual reviews, they review them all on the call, and then they vote to accept them all. On other occasions, and more recently, the TOC members each took on several projects; they reviewed them and then prepared slides based off of the PR content, and invited the sandbox projects onto the public meeting, where we discussed them, to answer any questions or add additional clarity to some of the comments that they had on the PR, such as those that Nikita pointed out, needing support from the CNCF or TOC members or even a tag. That worked fairly well; it doesn't work when we have over 30, I think 35. So we need to figure out a way, or a process, that allows the tags to provide recommendations to the TOC, and the TOC to take action on them. We could do that as voting through GitVote on the PRs, but that's a lot of PR voting to do. We could do it in bulk, en masse, providing a due date associated with them, or a periodic, maybe every other month, check-in on the current status of annual reviews: if they're placed in a state that says they've been recommended by the tag, the TOC can just jump on a call and provide that voting. So there's a few different options there. I'm curious if TOC members have a preference, based off of the GitVote that we've been leveraging and past sandbox reviews. The thing I'll share is that I was at an end user company at some point, and there was an architecture review board. The enterprise architects, or people with significant expertise, were on the board, and we would assign individuals from the board to kind of mentor or support a project, and then when they came to present to the ARB, that person would come along with them. And often that person would give their perspective after the project presented, like, "oh, I evaluated XYZ."
Maybe that's an approach: we assign it to the tag, the tag assigns a person to shepherd, to mentor, and then that mentor comes with the project to present, you know, for five minutes or something. But that's something that could potentially be done. I think I like that; it mirrors a lot more of that mentorship and advisory function that we'd like to see the tags engage in for a lot of our projects. Leo had posted in chat around interest in participating in some of the reviews, and thinking about how this could make the work more interesting: shout-outs at a conference, or a blog post, or even issuing badges for people that participate in a review. That's definitely something, I know Nikita brought it up to me this morning, and the TOC is checking on the status of it: recognizing tag member contributions, particularly around activities that are not necessarily seen as glamorous but are necessary to the function of the ecosystem, and definitely appreciated. So we can add that into our next conversation around badges and tag member recognition. You know, that also helps these people justify it to their employer, so that's something to keep in mind too. It does, yeah. I mean, I'm just making a note. Okay. What else hasn't been discussed here? What other ideas, thoughts, perspectives, opinions, complaints, constraints do you all have associated with this? Josh: I got a lot to say, right? One thing that I'm thinking is, how can I leverage my tag to be more of a community for these projects? I'd like to not just be a clearinghouse for them; I don't want to force them to come participate.
You know, I'm just wondering if we can leverage this work to further that: like, don't just show up once a year and give your review; pay attention to our threads, and when you see a project related to yours come in, chime in with your review of it. How can we encourage projects to be part of our community? So some incentive alignment needs to happen as part of this: they're getting value out of the reviews, and we'd like to see that value returned back into the tags. So that's something that we can look at, exploring how we do that as part of this process. Karina: thanks, Emily. I already mentioned this to you, but not necessarily to the wider group. As sandbox projects are coming in for review, as part of the process, is there another project that they could be integrated with? How do we drive more efficiencies within the ecosystem, versus asking more people to do more? How do we make each project really effective and maybe blend in more capabilities, etc.? That's definitely a good call. That could definitely be something that's considered as part of those annual reviews: providing recommendations for projects on where they could potentially collaborate or partner with other projects, either within a SIG or within a tag, or even potentially extending the cloud native ecosystem further, because there are a lot of open source integrations that happen outside of cloud native from an adopter's perspective. So being able to identify those, and having situational awareness of what exists within the domain of a given area of the ecosystem, would be beneficial for sandbox projects too, and could bring more contributors into the cloud native ecosystem. Other thoughts and comments? Okay. So, we need to figure out how we account for archiving.
We need to figure out management of projects that appear to be retaining their presence in sandbox far longer than we expect or anticipate them to. Highlighting where projects need help as part of that annual review, making sure that we have a positive engagement with them, and that we can respond in a timely manner to any request for support. Helping to ensure that the tags receive community growth as a result of this, and recognition, like member badges, particularly if they're participating in a review. We're promoting project integration and efficiencies, as well as verifying that we're seeing some positive trends for new contributors and new maintainers to projects as they go down that path towards incubation. So, let's talk next steps then. The TOC has over 30 of these to do. How do tag chairs, tag members, and TOC members feel about this: let's start with a lightweight process for right now, and the liaisons will partner up with their corresponding tags to review some of these projects jointly. It can be async, it can be on a call, however you all wish to manage your time. Does that make sense, probably, for these first 35 that we have? Seeing some head nods. Okay. We also need to ensure that no single tag is overburdened with the volume of the annual reviews. Does anyone have a preference or a recommendation for how to divvy this up? I had previously gone through and loosely assigned TOC members, and there's only so many of us to look through them. We need to ensure that it's an equal distribution among everyone, so how do you all feel? I'd be curious to hear from TAG Runtime, because they have the bulk of the work. We could use more help from the other tags, but yeah, you know, we can get started with just the list and start knocking them out one by one. Okay. Yeah, but I think TAG Environmental Sustainability offered to help, so maybe we get some help from them.
This might also be a good opportunity to take another look and see if we can maybe shift a few projects which need a review before we start the review process, because some of them can conceivably fall into different tags. Maybe we can either put them on the back bench, or we can ask other tags to actively take them on where the mapping isn't super clean. Yep. Okay. Karina, you asked if this particular spreadsheet was shared anywhere; it is not. However, on the TOC repo we have a new project board, thank you, Chris, for setting that up, that has an annual review tab listing all of the annual reviews that are open. Duffy, you had your hand raised. I was going to ask if somebody has set up a project for our reviews so that we can actually at least all see where they are, and then kind of work through them. It's there now, Duffy; it may not be perfect, but it's better than what we had before. Yeah. So this is brand new. Thank you, guys, for pulling this together; it certainly helps us understand the breadth of what it is that we have. So let me catch up on meeting chat real quick. Krishna asks whether or not members outside of the official tags and TOC could be invited to shadow, to understand how these reviews are done, and to volunteer; maybe the issues created for these might help to get a second line prepared for the next round. Yep, that's fair. I don't see that as being a problem. It's not an official thing that you have to apply for; it's more about participation in an area that you're interested in and being available and present to do the work. So, Krishna, if you're interested in doing this, definitely select a tag, pick a domain, and volunteer to assist. It's a learning process for everyone, especially since this will be new for us.
What I'm going to do is take the action to go through and try to rough in that lightweight framework. If there is another tag member, TOC member, or community member on the call that is interested in participating and assisting, or in having your ideas captured, please DM me on Slack; I will be more than happy to take your assistance. And once we have that done, I'm going to shoot for the next two weeks, I'll share that out in the public TOC channel in Slack and on the mailing list. That way we can propose a rough timeline for how we want to get started with this. There's a note in chat that one of the projects should be mapped to TAG Storage; yep, I can do that now. Does that work? Sounds good. Okay, cool. We have about 20 minutes left. Did anyone have anything else that they wanted to bring up, short and sweet topics, questions, anything else? I have one question for these sandbox reviews: is there a specific timeline, or is this, you know, best effort? For right now, let's go with best effort. I would like to try to get these 35 done no later than the end of September, because they have been outstanding for a long period of time, and we're going to have more coming in. I think, for end of September, I figure the next two weeks for getting it written up, and then plenty of time for TOC liaisons to meet with the tags and partner and decide on going through this. Does that seem reasonable for everyone? That sounds good to me. Okay. Awesome. All right. Well, anything else? Okay. Thank you so much, everyone, for attending today; we really appreciate your opinions and inputs as we try to make this process a lot better for all of the tags and the sandbox projects and TOC members involved. We really appreciate it. Have a wonderful rest of your day. Thank you all. Thank you. Bye bye.