All right, Taylor, can you share the deck? Just in terms of attendance, it looks like Alexis can't make it this time due to vacation. Is Brendan Burns or Joe Beda here? I don't see you guys on the list. Okay, Joe is also on vacation. So I'll note that, cool, cool, all right. So we've got Brian, Jeff, Kelsey, Matt, Quinton, and Xiang.

So kind of first order of business: we had TOC chair elections. Essentially, Alexis has been TOC chair for the last few years and has done a fantastic job, and we had a new set of elections for the TOC. And I'm happy to say that Kelsey Hightower will be taking the reins. So at least Kelsey is on this call. Kelsey will essentially be leading these meetings and taking the role of TOC chair, which also has a seat on the governing board representing the interests of the TOC and technical community. So, Kelsey, feel free to say something or say some words if you like. Otherwise, I think it'll be...

Well, I'm still ramping up. So I'm just kind of understanding the scope of the TOC, making sure we can keep these meetings super focused and maybe focus a little bit on the outcomes. It looks like the last couple of years have been about defining the TOC and what it should be doing, and maybe now we just double down on that scope.

Cool. Awesome. So in terms of the agenda today, we'll go over a couple of things: we just kicked off the containerd graduation vote, which is happening live; we'll have more discussions around CNCF SIGs; and we have a community presentation from the CII Best Practices Badging project. David Wheeler is here from that community and will be presenting a little bit later and answering any questions that folks have for a very interesting project. And then we'll talk a little bit about the backlog. So feel free to go ahead, Taylor.

Yep, containerd graduation vote. So just a reminder to everyone that this is going on. Right now there's the message where the votes are happening and the pull request for full details.
So hopefully, we'll leave this open for about a week or so and come to a conclusion on that. So thank you to everyone who has voted so far. Moving on. KubeCon + CloudNativeCon, yeah. Sponsorships are open. We just opened the maintainer track, sorry, the maintainer slots for KubeCon China. So if you are a maintainer on a CNCF project, you should have received the note to request your intro and deep dive sessions there. But other than that, we have three big events this year that we're looking forward to. Next up.

Yeah, and Dan reminded me that Friday is the deadline for the Shanghai CFP. So if you wanna get your talks in for Shanghai, please do it by Friday. So like I mentioned, Kelsey is TOC chair and taking the reins, super excited about that. So next slide.

Another kind of administrative item for the TOC: we have term limits this time around. And since we're trying to stagger terms instead of having six slots open up at once, we just did essentially a randomized selection of who gets two-year terms and one-year terms. So here are the results of that: Xiang, Matt, and Brendan have two-year terms, and Alexis, Joe, and Kelsey got the one-year term slots. Next up.

Another friendly reminder to folks: some folks have been reaching out about the annual report. We had a lot of feedback from the conferences in terms of trying to be more transparent about the number of attendees and users, vendors, satisfaction levels. So we published a conference transparency report for the last KubeCon we did in Seattle. So please take a look at both of these reports and give us any feedback that you have to kind of make these things better for folks.

And just a final reminder about Google Summer of Code. CNCF has been participating for the last three years and has had an amazing set of successes for these projects.
We've had some interns from Summer of Code actually become full-time maintainers for some of these projects and even end up getting full-time jobs at amazing companies that we have in our ecosystem. So please take the time to submit a project if you're a CNCF project, it's well worthwhile. Next up, all right, SIGs.

So we've been talking a lot about SIGs. I think Quinton has been taking point on this along with Alexis. So Quinton, you wanna give an update, since I've seen the massive pull request activity over the last couple of hours?

Sure, yeah, I can do that. I think most of the feedback has been addressed in a way that I hope is adequate for everyone. So most of the comments are actually incorporated into the document. I think there are two outstanding areas that perhaps are worthy of additional discussion. We may not have time here to do justice to them. The one is the question of whether a TOC member actually has a role to play in these SIGs: the proposal is very clear that there is a co-chair, termed a non-executive co-chair, on each of the SIGs who is a TOC member and who essentially represents the TOC's interests on that SIG. And there's been some comment to say, well, we don't need such a thing, the SIG can talk to the TOC directly. So that I think is unresolved. My personal opinion is that it's not practical to have all of the SIGs talk to all of the members of the TOC all of the time and that we do need to channel this stuff through representatives, but there seems to be some disagreement on that.

And the other one is about what I'll term SIG sprawl. I think there are many individuals and companies who would like to see their favorite area have its own SIG. And so there have been several different but similar-looking proposals to split SIGs up into smaller pieces. And if we take that to its logical conclusion, we end up with a fairly significant number of SIGs.
Beyond the SIG sprawl problem, one problem I've identified is that for many of these SIG areas there's not necessarily clear agreement on where we should split them. If we should split them into pieces, exactly where are the seams along which they're logically split? So my personal preference would be to start with a smaller number of larger SIGs and have one of their first items of business be to survey the lay of the land and figure out where to split that SIG if necessary. Currently we have six proposed SIGs. There are at least a couple of those where people have come back and said, oh, this should not be one SIG, it should be split in two. But it's not clear to me that we actually have agreement on exactly where the split lies. So it seems like the SIG itself would be the best vehicle by which to define where the separation between two related SIGs is. So that's my brief summary. Does anyone want to add to that? I know Yuri, you had quite a few suggestions in the PR. I'm hoping that I addressed them all. Most of them I actually changed the PR to accommodate. In fact, all of them. Anyone else here who's made comments? Stunned silence. Okay, that's my spiel for today.

Yeah, I mean, Quinton, one idea for me is we essentially need to get this to a point where we'll have the TOC vote and then maybe start a little bit small with a pilot SIG. That's kind of my thought process here to actually get this moving.

Totally agree. So my thinking is that it's in a votable form at the moment. I just wanted to highlight that there are two comments from the community that have explicitly not been addressed. And so the TOC needs to decide whether they are happy with those as is or whether they would like them changed. We could, yeah. I don't know if you want to call a vote now or get an indication of whether the TOC is ready to call a vote now?
Yeah, I mean, maybe a bit of an indication of whether anyone on the TOC is actually strongly opposed to calling a vote. And then we could clean up the last little bits and then pick a pilot one or two SIGs to kind of start this and go from there. Yeah. Having a vote sounds good to me. Anyone else? If not, we'll leave it open for a little bit for some final feedback, then we'll call a formal vote of the TOC and community, and then hopefully by the next TOC meeting we're gonna try to get at least a pilot SIG being formed. Sounds good. Cool, all right.

I think next up is David Wheeler. Yeah, keep going, Taylor, yeah. All right, so we've been having a lot of discussion around improving graduation criteria for CNCF projects, expanding the scope, maybe mandating security audits, or potentially requiring different levels of CII Best Practices badging and so on. So we thought it would be good to have one of the main authors of the best practices project discuss a little bit of how they work, how you could share improvements with them, methodologies and so on, since there's been a lot of generally positive feedback about the best practices project. So I will hand it off to David Wheeler to introduce himself and talk a little bit about the project before we open it up for feedback from everyone, okay? Are you there, David?

Fantastic, absolutely. Can you hear me? Yeah, I hear you, you're good. Excellent, all right. I'm not sure that I can control the slides, so I think someone else is pushing the forward and back button. Excellent, thank you. All right, so I'll try to say next when it's supposed to move on and resist the urge to touch the keyboard.

All right, so David Wheeler, I'm the lead of the CII Best Practices Badge project. It's basically a sibling, another project from the Linux Foundation under the Core Infrastructure Initiative.
I think the key reason that I'm being asked to talk with you guys today, and tell me if I'm wrong, is that currently a passing badge is already required for graduation status. We actually do have two higher levels, silver and gold, and it is possible to change the criteria, though it's not easy, and there are reasons for that. So I think the goal today is to try to find out: is what you're doing the best that you can do for yourselves, for your decisions, or do you wanna make a different decision strategically? Is that correct, not correct, Chris?

Correct, and also I think there are some folks that may not be familiar with the project, so I wouldn't assume full-blown knowledge of it. Most of our projects have gone through the process, but not everyone on this call actually has, and there's kind of a wider community on it.

Okay, fantastic. I didn't put much about that in the slides, but I'll try to verbally say some things, and if you guys have questions, please just pipe up. I think that's kind of the point of the call today, to have that discussion. So just to level set here, the underlying premise of the badging system is that open source software tends to be more secure and higher quality if you follow good practices: things like, do you use version control? Do you have automated tests? And we basically went around to try to identify those best practices, and I wanna emphasize, this is for production of open source software. We're not really focusing on ingesting it; obviously that matters too. We very much based the practice list on the practices of existing well-run projects, with the overall goal of making it more likely that you have higher-quality, more secure software. A key thing to note is that the CII Best Practices Badge is intentionally designed to work for any open source project, and we'll talk a little bit about how we do that, but the goal is to have a general-purpose set of criteria. The main website is listed there; if you go there, you can learn lots more. Next please.
So I wanted to show some quick stats, just a screenshot from earlier. There are over 2,000 projects currently pursuing badges, over 200 that have actually achieved a badge, and a large number of projects at various stages towards getting at least the passing badge. And you can see over time there's been this steady growth of participating projects, and not just projects joining, but projects working to get a badge. Next.

There are actually three badging levels, as I mentioned earlier. Passing is the one that we've really been focusing on telling folks about, but we also have the silver and gold levels. I do want to emphasize that any level is an achievement. So "merely" getting a passing badge, the word merely is really misleading. For a lot of projects, they find that it's a real achievement. For each of the higher levels, you have to achieve the previous levels; that's just for simplicity's sake. And I mentioned this earlier, we're really focused on stuff that real projects do. We're totally uninterested in criteria that, you know, some academic somewhere thinks are a good idea but no one actually follows. Not interested. The passing badge was an attempt to capture what well-run projects typically already do. That said, we found that a lot of projects find they're missing something. When you gather a whole bunch of criteria, any one of them is done by most well-run projects, but when you collect them all together, people find, oh, I'm missing one, I'm missing another. After we identified the criteria, we grouped them into six categories, and I think that helps give you a sense of the kinds of criteria there are: some basics, you know, licenses and so on, change control, reporting, quality, security. Silver is harder than passing, but we did design it so it's possible to achieve even for single-person projects. Gold really does require multiple people.
You know, some of those criteria are great for users, but you have to have multiple people on the project. I don't think that's such a big deal for CNCF, but it is a big deal for some other organizations. Next.

So I could talk quite a bit about the criteria, but I also wanna talk about what the criteria don't require, and this is by intent. First of all, we never require any specific technology, product, or service, and we don't require or forbid any particular programming language. We never require proprietary software. You certainly can use it, you can depend on it; we designed it so that those who do not want to depend on it can still get a badge. It doesn't cost anything to get one. The goal is very much not to take over projects. It's, you know, here are the things you should be doing; you can decide how you want to achieve them. And we absolutely don't require doing everything immediately. Some projects find they're doing everything and they get a badge essentially immediately. But some projects say, hey, I wanna get a badge, and they find out, well, whoops, I'm missing something. You'd be surprised that there are still projects out there that don't have automated tests. I'll say quietly to some of you, shame on you, but okay, they'll have to go make an automated test suite. Another common thing that's missing, and one of those super easy things to fix, is telling everybody how to report vulnerabilities. You can do that with one sentence in a readme, but there are still a lot of projects that don't tell everybody how to report vulnerabilities. And for some projects that turns out to be a big problem and creates lots of challenges for researchers who are trying to report a vulnerability: they first have to figure out how to contact the project, and that's the wrong time to figure that out. So a lot of projects, yes, they get close and then they gradually work off the rest. Next, okay.
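As a rough illustration of that criterion, a check for a vulnerability-reporting process can be as simple as looking for a SECURITY file or a reporting sentence in the README. This is a hypothetical sketch, not the badge application's actual implementation; the file names and the regular expression are assumptions for illustration only.

```python
import os
import re


def has_vulnerability_reporting(repo_dir: str) -> bool:
    """Heuristic: does this repo tell people how to report vulnerabilities?"""
    # A dedicated SECURITY file is the clearest signal.
    for name in ("SECURITY.md", "SECURITY", "SECURITY.txt"):
        if os.path.isfile(os.path.join(repo_dir, name)):
            return True
    # Failing that, one sentence in the README can satisfy the criterion.
    for name in ("README.md", "README", "README.rst"):
        path = os.path.join(repo_dir, name)
        if os.path.isfile(path):
            with open(path, encoding="utf-8", errors="ignore") as f:
                text = f.read().lower()
            if re.search(r"report(ing)?\s+(a\s+|security\s+)?vulnerabilit", text):
                return True
    return False
```

A single line such as "Please report vulnerabilities to security@example.com" in the README is enough to satisfy a check like this, which is the point David makes: the fix costs one sentence.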
I didn't put up a list of the sample criteria for passing, since you guys are actually already requiring that for graduation, but I think I've already hinted at some: automated testing is one, version control, unsurprisingly, is one, telling everybody how to report vulnerabilities. It's those kinds of things. And the goal was really that most people, when they look at the criteria, don't necessarily say my project does all of it, but they do say generally, oh yeah, I should be doing those things. And that's the response that we're trying to get: you may not have thought of this, but once you read it, you go, oh yes, that's something I should be doing.

The silver criteria build on that; you have to meet the passing criteria before you can get silver. For silver: contribution requirements, you must tell everyone what the requirements are for acceptable contributions, coding standards, whatever. You have to have a bug tracker. You have to use some sort of static analysis tool to analyze your source code, assuming that there's at least one open source tool that does that. Next, oops, do we not have a next? Oh, too many. Okay. Another one is, if your software is being produced in a memory-unsafe language like C or C++, then you have to use a dynamic analysis tool. We're not saying you can't use C or C++, but there are certain problems that are really common in software implemented in C and C++. And so while dynamic analysis is great for everybody, you have to use it for those languages to counter some of the problems that can otherwise creep in. Another requirement, which a lot of folks are already doing but many others are not, is that there should be some legal mechanism to assert that, in fact, the contributions are legal. And the easy way to do that is a developer certificate of origin, DCO. The Linux kernel does this, lots of other projects do it. It's a low-effort ceremony, but it helps keep some legal problems away.
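To make the DCO mechanism concrete: contributors add a Signed-off-by trailer to each commit message (`git commit -s` appends it automatically from the committer's configured name and email), and a CI job or bot rejects commits without one. A minimal sketch of such a check, with the trailer format taken from common DCO usage:

```python
import re

# The DCO sign-off is a commit-message trailer of this form:
#   Signed-off-by: Jane Developer <jane@example.com>
SIGNOFF = re.compile(r"^Signed-off-by: .+ <[^<>@ ]+@[^<>@ ]+>$", re.MULTILINE)


def has_dco_signoff(commit_message: str) -> bool:
    """Return True if the commit message carries a DCO sign-off trailer."""
    return SIGNOFF.search(commit_message) is not None
```

Real projects typically delegate this to an existing bot rather than writing their own, but the whole mechanism is just this one-line trailer convention.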
The project should have what's called a bus factor: you should have at least two people, so if one person gets hit by the metaphorical bus, there's somebody who could take things over. We're only making that a should here; we move that to a must later, at gold. Next, governance. And of course, given this particular body, you're already very familiar with the issues of governance, but some projects don't tell everybody, here's how we govern the project, here's how we make decisions, what are the key roles. And for silver, we say you've got to tell people how that happens. We don't mandate a particular process or governance model, but we do say you have to tell everybody what it is. Documentation for security: you've got to document for the user what they can and can't expect for security. Different projects have different expectations, and we're trying to be flexible, but we do think that every user should know what the project is trying to achieve and why they think they're achieving it.

One requirement that got a lot of discussion in the process of creating these criteria was test coverage. The passing badge only requires that you have a test suite and that you've agreed, in general, that you're going to improve your test suite as you add capabilities. But this requirement at the silver level says, hey, your tests have to cover at least 80% of your code. Now, many people who are familiar with test processes and test coverage can easily tell me, well, you can have that coverage and have a lousy test suite. Absolutely true. But if you don't cover most of the code, for sure you have a lousy test suite. So this at least gets rid of the pikers who say, yeah, I've got a test suite, but it doesn't really cover a lot of the code. And no, that's not all that great; a higher bar is better. Next. All right, so what I'm trying to do right now is give a sample of what these criteria look like.
For gold, a bus factor is now a must. You absolutely must have two or more folks, so that if somebody gets hit by that bus, and we hope not, but it can happen, we're all mortal, somebody can take over. At this level we have another requirement, which is even stronger: there have to be at least two unassociated significant contributors. By unassociated, we're talking about corporate association. If everybody's from a single company and the company decides, you know, I'm not interested in investing anymore, projects can sometimes fold pretty quickly if they're supported by a single company. This is something you guys at CNCF are of course painfully familiar with. So we're basically saying, hey, you've got to have at least two different organizations involved. Requiring 2FA: this is basically making sure that you're using 2FA for changes to some central repo or sensitive data. Two-person review: lots of debate about how much review is necessary. A lot of folks would ideally have 100%; that seems to be a challenge for a lot of projects. So the requirement is that at least half of all changes get reviewed by someone else. Next, reproducible builds. I'm a big fan of reproducible builds; I encourage them wherever I go. Some people have trouble doing reproducible builds; if you've been following the work of Debian, you know they've got some amazing work going on, but it's not necessarily actually deployed out in their system, and that's one of the organizations that's working especially hard at this. So at the gold level, we say you've got to do that. The test coverage for statements goes up to 90% and branch coverage goes to 80% for your test suite. Again, the goal here is we want people to have good automated test suites, and this is a way to help ensure that in a way you can quantitatively measure. Security review, I noticed a lot of talk about this.
The way the CII badging requirements have it is that you have to have had a security review in the last five years, and that has to consider both the requirements and the security boundary. I'm sure we're about to have discussions about that, and that's great. Next up.

All right, can the criteria be changed? Yeah, actually we expect them to change over time. That said, changing the criteria can have a big effect on projects that are participating. Our goal is to not change criteria so much that nobody's interested in trying to achieve a badge. The badge of course is just an outward symbol; the real goal is to get projects to do good things and for users to be able to know that those projects are doing good things. So projects that have honestly received a badge have a right not to have it just taken away from them without notice. We do make some changes immediately, stuff like spelling corrections and grammar corrections. If there's a clarification of something that was expected but maybe was not clearly stated, we at least make sure that there's an opportunity to review those changes. But let's say we want to make a more significant change, like a whole new criterion. Those we expect to happen less often, and projects have to be given time to either object or modify their project to comply. At this point, we generally expect badge criteria to change at most once a year, and we say projects have to have at least two months' warning of the new criteria and an opportunity to keep the badge. We actually have a system called future criteria where we can identify that, basically, here's a criterion that's going to take effect, and you can see it, you can review it, you can tell folks, yes, we're compliant, and you have time to make that change. In general, it's easier to change the silver and gold criteria simply because there are far fewer silver and gold projects, and they're already expecting those levels to be much harder.
That's not to say that we can't change the passing criteria; we certainly can and have, but we want to do this in a way that is not harsh on the projects that are already participating. Details on our own governance structure and the criteria-changing process are there; the URL is on the bottom. Next, yeah, and this is my last slide anyway.

All right, so CNCF graduation status currently already requires getting a CII Best Practices badge at the passing level. You folks can make the decision: do you want to stay with passing? You could switch that to silver or to gold for graduation, depending on what your objectives are. You can propose changes to the criteria; obviously that's a big deal because we don't want to do that haphazardly, and those changes have to really apply to all projects, not just CNCF ones. But, you know, if you've got a great idea, something that really should be there, fantastic, we'd love to hear it. And of course you can keep doing what you're doing right now, which is to build on the CII Best Practices badge: pick a level and then add specific criteria that you think are important for your projects. That's all I've got in terms of a little presentation. Love to help, love to answer questions, love to be engaged, whatever you'd like. And Chris has returned, hi.

Hey, no, there were a couple of questions in the chat. I don't know if you've... No, actually I didn't turn on the chat. I have trouble following chat and talking at the same time, chewing gum and so on. So yeah, I thought people would jump in by voice if they were gonna do that. So okay, let's see. How do I turn on chat on this crazy thing? No worries, I'll just go. Someone asked how much of the CII badging is objective versus subjective?

Oh, of course now you've got to define subjective, and that's subjective, right? Yeah, how far down this rat hole can we go? The goal was to be as objective as we could. There are definitely ones that are subjective.
And for some of those, it's not even clear how you would make them objective. But as far as we could, I would say the majority of them are objective. Some are subjective, but that's okay; simply making the attempt is a good thing. I think my favorite subjective one is that you have to briefly describe on the website what exactly your project is. I have no idea how to automate checking that; is a sentence or two clear? I have no idea. But for that one, what I say is: you make the attempt and you're good. There are actually a number of projects, it's less common now thankfully, that immediately get into the here's-how-I-was-built and so on. But what does it do? Tell me that. And most of the other ones, things like, for example, the test coverage, 80%: it's a very simple measure. There are lots of tools out there that can measure that. It's not subjective at all, it's quite objective. And we really strive for that through all the criteria. Okay? Cool, thanks.

So one thing we're trying to figure out, as we're refining the graduation requirements for CNCF projects and it's a discussion for the TOC, is: do we keep things the same? Do we raise the bar by maybe moving to a higher level in the CII thing, like silver or gold, or do we just modify the requirements we have and put our own kind of twist on top of the base passing badge? So I don't know if there are any strong feelings on the TOC; anyone wanna bring this up? Matt, if you wanna speak to this; Brian; there are people chatting.

Sure, I mean, there are a couple of things that I would love to see us actually require. I feel less strongly about whether we keep passing and add stuff on top like we do today, or we talk about actually changing those requirements.
So I guess I'm wondering, is it worth it to talk briefly about some of the things that we think are missing at the current passing level, and then we can discuss whether we think it's worth working those into silver or gold, or just adding things on top? Yeah, absolutely, love it. I have some opinions about that. But I had a question first: are dependencies of a project treated differently than the source code generated by a project?

They are, and this is because of the challenging reality that most projects depend on a massive number of other projects, particularly when you consider the transitive dependencies, which you really should, because the transitive dependencies matter too.

Okay, good, that's what I would expect, but for that reason, I think it's important for us to, if we can, really try to encourage dependencies to meet the Core Infrastructure Initiative's bar as well. Otherwise, dependencies will be the weakest link of our projects. So I would prefer to make sure that the CII passing level is one you thought was satisfactory, as opposed to creating a bunch of our own special requirements.

Right, one thing that has been mooted, if I can respond real quick to that, is basically saying that maybe at gold, someday, we could require that all of your direct dependencies themselves have a badge. And although that's only the direct dependencies, of course they themselves would start working on it, and that would encourage folks all the way down. At this point, we haven't done that because of the simple chicken-and-egg problem. It's hard to require everybody to get a badge when, at the point we first started, of course, we only had a hundred projects working on or achieving a badge. So we would definitely like to increase requirements on dependencies more and more, because obviously that's where a lot of the problems are today, but it is challenging right now to do that.
But we can certainly at least require that you track your dependencies better, and then in the longer term, I'd love to see, hey, let's start dealing with the chains of dependencies to make sure they're doing well too. The JavaScript infrastructure in particular is where that's really a challenge, where people create little modules that are maybe only a couple of lines long, and so you end up with a huge set of things you need to deal with. We have thought about how to deal with that, but that's where we are right now. Oops, I heard something.

Yeah, no worries. Can I have Matt Farina and then Matt Klein? I just had two quick thoughts on the transitive dependencies, because that's hard. So Go is not as bad as JavaScript with so many dependencies, but it's not small either, right? Go look at the Kubernetes dependency list. While I like the idea of getting all the transitive dependencies there, I would suggest maybe that be put out there as a long-term goal, and we ask: how do we help, you know, carrot projects into getting there? And this might be a great place to go partner with GitHub, because I know GitHub has these recommendations for projects, you know, include a license file and things like that. If we could take the basic badging level and see if they're willing to do some evaluation and guidance on this for projects, this might be an easy way to start nudging more projects that way. Because I would love for some of the dependencies to start going down this path, and if we could partner with them, that might just kind of put more of these carrots out there. Just a thought I wanted to pass on while we were in the moment, because I like the idea. I just don't think it's practical in the near future, but I would like it someday.

Right. And this was our conclusion as well. We would love to mandate more for dependencies, but that gets challenging for all the reasons you've just listed. Cool. And I think Matt Klein had something to say. Yeah. Yeah.
So on the deps, it would be super awesome for us to badge all of them, but I also don't think it's actually reasonable. I mean, projects will potentially have tens or fifties or hundreds of deps. I do think that we should potentially invest in tooling to have some type of dashboard to see what dependencies projects have, and then maybe we could have some indication of whether those dependencies have been badged. So I think that would be awesome.

On the actual things that we should require: there are two main things that I would like to see, and I actually put this in the TOC issue, and I'm not sure how they will fit in. My first thing would be some level of code coverage. So whether we take that from silver or gold, I think some basic level of 80% or 90% would be nice. I totally agree that having code coverage does not mean that the test suite is good, but if you don't have the code coverage, it's guaranteed to be bad. The one that I feel even more strongly about, though, is that I feel we need to require 100% of changes to be code reviewed. So that's the thing that I'm most concerned about: the gold level only stipulates 50%, and I feel strongly that any graduated project at CNCF must have 100% code review. And at least speaking from the Envoy perspective, we go further and say that any non-trivial PR has to have a code review from multiple orgs, and that prevents one organization from pushing changes through. So that's just something that I feel so strongly about that I would like to figure out how to actually get in.

Well, let's see, let me make a couple of responses, and I'm sure other folks are gonna wanna jump in too. First of all, it's a little awkward because I am actually not against 100% code review. I love it, I think that's great stuff. We did get a lot of pushback from various projects who said basically that that was, for them, not particularly practical.
I'd have to go back and dig up some of the specifics. But that was one of the challenges: when you're looking at it broadly across lots of different projects, different projects are in different kinds of situations. Obviously, if you wanted to add a specific code coverage requirement or a code review requirement to CNCF, you could do that right now. You add specific requirements and on you go. And by the way, there is actually a two-way street here. I mentioned earlier, one of the main criteria for something to be added is that there's evidence that projects are already doing it. The more projects we can point to that are doing 100% code review, the easier it is for us to add that as a requirement. So you can actually work it the other way. If CNCF says, hey, to be graduated, you have to achieve 100% code review, fantastic, you can just do that yourself. And then we can use that as an argument with other projects to say, hey, it's time to step this requirement up. I had a quick question around this idea of being able to change the requirements. I can fully appreciate the backwards compatibility concerns and not wanting to change things under projects, but I can equally see that, if you extrapolate this thing out a few years, there are gonna be a lot of things that we're gonna want to improve upon over time, and we don't want to be restricted in our ability to do that. Have you considered something like a year-dated compliance, where you say, we got the 2018 gold badge or whatever, and every other year maybe the gold badge changes, and we're not compliant with the 2020 requirements, but we did get the 2018 ones, or something like that? That is actually the expectation. As I mentioned, we really don't want to change more than once a year, and when we need to change the requirement sets, we want to version them, basically by year. That's exactly what we're thinking.
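The year-dated idea discussed above could be modeled as versioned requirement sets: a project holds a badge against a specific year's criteria, and later years can tighten the bar without invalidating earlier badges. The criterion names and years below are purely hypothetical:

```python
# Hypothetical, year-versioned gold criteria. Later years add requirements.
GOLD_CRITERIA = {
    2018: {"https", "automated_tests", "code_review_50pct"},
    2020: {"https", "automated_tests", "code_review_100pct", "coverage_80pct"},
}

def badge_years(project_criteria, criteria_by_year=GOLD_CRITERIA):
    """Return every year whose full criteria set the project satisfies."""
    return sorted(year for year, required in criteria_by_year.items()
                  if required <= project_criteria)

# A project that met the 2018 bar but not the stricter 2020 one:
# it keeps its "2018 gold" while showing as non-compliant for 2020.
project = {"https", "automated_tests", "code_review_50pct", "coverage_80pct"}
```

The design choice this captures is exactly the one raised in the question: compliance is pinned to a dated criteria version, so changing the criteria never retroactively strips a badge.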
That said, right now there are a lot of projects which are struggling even to meet the passing criteria. That's a sad statement, although "struggling" is maybe a little misleading. What we find is that they're doing many of the criteria, but there are many other things they're not doing, and some of them are kind of remarkable. You're not using HTTPS? Shame on you, get with the program. You don't have automated tests? Shame on you, get with the program. But because you've got projects that are still at that level, there's a trade-off: if you add requirements and make them too difficult, they decide not to participate at all, and I would rather get half a loaf than no loaf at all. And that's why it's a lot easier to make changes to silver or gold, because at that point you've already committed to working much harder to do well. There are unfortunately a lot of projects which are, in my mind, not even doing the basics, not doing the foundational things. Yeah, I'm trying to now follow the chat, and yes, Kubernetes is already at passing, so is Prometheus. And I'm pretty sure containerd is also; they've all achieved passing badges. Yeah, and in fact, the CNCF landscape actually does include information on the badges already. The web application actually makes it really easy to make requests about projects and such. So the landscape already shows information like that. Alrighty, now I gotta talk and read at the same time. Yeah, any other questions? I mean, I think the concrete next step is to leave feedback on the GitHub issue that Matt Klein linked, and then I think from there we could distill potentially a couple of requirements to add to the graduation criteria and decide whether it makes sense to essentially push them upstream to CII badging, which seems like it may take a little bit longer. I think that's actually a good process in general.
As I said, if you add them, and I'm sorry to interrupt here, that makes it a lot easier for us to argue that projects really are doing that. And to answer the question in the chat: will graduated projects need to go back and meet the new requirements? No, but that's essentially at TOC discretion. I would say that we would hope that you would. Each project can make its own decisions, but if we add something, we're only gonna add it if there's good evidence it's worthwhile. Frankly, we'd wanna know right away if we've made a mistake, that's one thing. And of course, if a project's doing something badly, you would wanna know that too. Yeah. Yeah, I was gonna add a comment along those lines. I think it's very valuable to have projects disclose where they stand with regard to these requirements. Even if we don't require them to meet any particular standard, just declaring unequivocally what their status is with respect to some of these requirements is super valuable. Yeah, right. I'm gonna have CNCF kind of do an audit of all of our projects and figure out where they fit, essentially, on the gold and silver ladder. Yeah, if they haven't attempted to get silver, they may actually have met more criteria than we know of. I haven't really talked about it, but we do have a website, a web app, that basically lets people fill in information; to the extent we can, we automate filling in the criteria, but some of it we really do need humans to tell us. Then we record that, so you can find out where various projects are. If they're not meeting something, the obvious next question is, well, why? And that would be useful for many reasons. Any other questions for David while we have him online? If not, I'll encourage everyone to move the discussion to the GitHub issue that was linked by Matt and also in the TOC channel on Slack.
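For the audit mentioned above, the badge web app reports per-project progress toward each level as a percentage, so bucketing projects on the ladder could be sketched roughly like this. The field names follow my understanding of the BadgeApp's JSON (e.g. `badge_percentage_0` for passing), but treat them, and the sample data, as assumptions:

```python
def badge_level(project):
    """Map a project's per-level completion percentages to its highest
    achieved badge level. Field names are assumed, not verified."""
    if project.get("badge_percentage_2", 0) >= 100:
        return "gold"
    if project.get("badge_percentage_1", 0) >= 100:
        return "silver"
    if project.get("badge_percentage_0", 0) >= 100:
        return "passing"
    return "in progress"

# Hypothetical audit over two made-up projects: one at passing,
# one a few criteria short.
audit = {name: badge_level(p) for name, p in {
    "projA": {"badge_percentage_0": 100, "badge_percentage_1": 40},
    "projB": {"badge_percentage_0": 97},
}.items()}
```

This also reflects David's point that a project which never attempted silver may still have a nonzero silver percentage, which an audit like this would surface.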
And hopefully by the next TOC meeting, we can distill a couple of concrete additions to the criteria, have a vote, and kind of go from there. Chris, may I interrupt for just a second? Sure, absolutely. So absolutely, please do contribute there. And if there's something specific to the badging that's not really related to CNCF, please file an issue on our GitHub. We like those. So we'd love to hear from you in whatever way you can. Awesome. Thanks, David. Appreciate you taking the time to present this awesome initiative. Love it. Thank you so very much for your time. So we have about 10 minutes left, and that pretty much wraps up this meeting for this time. I'll kind of leave it open if anyone has anything else they wanna discuss, or if there's someone on the TOC that would like to bring something up. Otherwise, we'll get 10 minutes back in our day. I will accept silence as we're good to go, and we're all gonna get about 10 minutes back in our day. So thanks, everyone, for taking the time to attend, and I look forward to having the discussion on improving the graduation criteria on GitHub. So thank you very much. Thank you. Thank you very much, everybody. Thank you, everyone.