Hello. Hey, how's it going? Good. It's been fun reflecting on the last year and a half or so. Hi, Emily and Paul. Brandon, do you have Justin's slide set you're gonna present? Yes, Sarah, I'm actually gonna present them. Awesome. I copied them and pasted in the template, which is a little different from what you all have been doing for slides, so you might have to update the little blurb I put in there, but I pasted in a link to the slides and I can also present. That's good. I'm gonna try to keep it to 20 minutes or less and just zip through things, and then hold questions, because I don't know what people are gonna want more or less of. And Brandon, please interrupt me if I get anything wrong or leave anything important out, because I took Justin's slides and then added some links and pictures. Diagrams. I'm a big fan; I love the diagrams, the colors. That's great. I'm gonna wait about another three minutes for folks to jump on, and Mark, per request, I am on camera today. Teehee, I was getting to where I could draw that photo of you myself.

Okay, quick reminder, we do need scribes for today. If there's anybody that can volunteer, go ahead and write yourself into the scribe section of the meeting notes document, which I will paste in chat. And for everyone, make sure that you add yourself to the attendance list. If you're a new member, or if you have an update, provide your name and we will do a quick round robin on updates before we get to our presentation. Let's go ahead and get started, then. Ash, do you have an update? Yeah, I just had a quick update on the Keycloak assessment. The PR is open for reviews and suggestions, so if you all are interested in adding your comments, please do so. I'm gonna paste the PR link in the chat for those interested. And yeah, feel free to add your comments. That's pretty much it; that's my update. Thanks, Brandon, for volunteering as scribe. Who else?
I have a reminder for everybody to submit CFPs for our Cloud Native Security Day North America 2020. Links are available in the past meeting notes. Who's next? Brandon, no updates. No other updates from anybody? Any new members? We have a lot of new members following along. Well, I guess we can get started with the presentation. Sarah, Sarah, you're still muted. Can you give us your perspective on security assessments? Yep. All right.

So Justin Cappos prepared most of these slides, but it's one o'clock in the morning in Shanghai, so I added a little bit of color and links and diagrams and my thoughts to the slides and volunteered to give the presentation. Brandon has also been part of the team from the beginning and pre-reviewed the slides, so he might chime in. This is supposed to set everybody's context about what these security assessments were intended to be and where we are now, and then there'll be a discussion, which Emily will facilitate. This is from the README: security assessments are a collaborative process for the benefit of the cloud-native projects themselves and prospective users, by creating a consistent overview of the project and its risk profile. Justin is the security assessment facilitator, which means he runs the process. I am a co-chair; for each ongoing project at SIG Security we have a co-chair representative, so I'm responsible for communicating to the TOC and the outside world about any concerns, and escalating things, I don't wanna really say up or down, but across, to the broader organization and to the review team, and vice versa. I'm also responsible for generally being the go-to chair for review of security assessments, although JJ has done a couple, either when I served as a reviewer or was on a leave of absence this summer.
So, the history of the process. In early 2019 we were getting requests from the TOC for evaluating projects, primarily ones that had not yet come to the CNCF. Justin Cappos had done this as a TOC contributor previously; he'd actually done it, I think, for SPIFFE and SPIRE, and maybe another. He did a first cut at what our process could be, something any number of people could follow. We generally thought this was a good idea and we liked his process, but the PR got into the weeds for many months despite lots of effort. So to break the deadlock, Justin came up with this idea: let's just do the least possible thing that we think could work. While there were many ideas that would be improvements on that, we put those on the table, and we would do five reviews and then review what works and what doesn't. So we merged the PR in early May with a number of open issues, and I linked to the ones that were open at that time in case anybody's curious. Then in May, Justin reviewed what we were doing at the KubeCon session, and up to now we have four completed and a couple in progress. So even though we haven't finished five, we think it's long overdue to do our first-five process improvement pass.

This is the very quick cheat sheet on the process. We identify the team: a project lead, and SIG Security provides a few reviewers. The project lead provides a draft document, which we call the self-evaluation, and there's an outline for that. There's a naive question phase, where the lead security reviewer or their delegate asks clarifying questions so that we have a complete artifact. Then the security review team does a close read, asks questions, typically comments in the Google doc, and requests revisions, usually things like "more detail on this item" or "confused by that." And there's a back and forth with the project lead.
Then there's a presentation and discussion at the SIG Security meeting, usually some further revision, and we do a summary assessment, where it takes a bit of work to get a one-slide overview of this thing. That seems to have value to the TOC, because for every one of these we've actually done a presentation, and that seems to go over well. To get unstuck, we all met together at DockerCon in 2019, where most of us who were scrubbing in on this process happened to be, and we came up with this timeline. One of Justin's key concerns, which I thought was really good, is that these things shouldn't stretch on forever. If somebody signs up as a reviewer, we should be able to say it's gonna take this much calendar time and this many hours to get it done. Maybe it's a little more at the beginning, but we should be able to get a reproducible process. We realized later that "dumb questions" was an inappropriate way to phrase this, but the idea being we spent a couple of days just asking these naive questions, like, wait, what does this project do again? Sometimes we might personally know, but we'd make sure that the artifact contained information about what exactly the project is and what it doesn't do. Then get the security view from everybody, iterate, and do the report.

I want to go into detail on this little picture at the bottom, because that was one of the things where we were like, well, if we only have a really rough outline of the process, how are we gonna have consistency across the reviews? How are we gonna make this work without incredibly detailed documentation that we can't agree on? So what we decided to do is we would start with the four of us, who were then: Justin Cormack, who's also on the call. Hi, Justin.
Justin Cappos, who actually recused himself from the very first security assessment because it was in-toto, which he's a contributor to. And then Brandon Lum and myself were the team. The idea being, all these little dots are the steps, the rows are different security reviews, and we would ensure that at least one person was consistent from a prior review. So we could get to the point where a future requirement of security assessments is that you've participated in a prior assessment. But how do you bootstrap that? One of the things that I'm super excited about is that we did a really good job with this, I think. I'm of course speaking from the perspective of having participated in almost every review, so I'm really interested in hearing other people's thoughts on how it went for them. You can see in the darker colors the first four of us and how we participated in different reviews. Then we had a bit of a challenge as we broadened the number of reviews and had multiple reviews starting at about the same time. I remembered that Robert was participating in an earlier review, though he might've been shadowing, but in any case we did do a good job of having consistency. And we decided that if you were a project lead, you really got the review process, and then you could lead a review, which is exciting. Ash, who I think is also here today (yes, I heard you announce yourself) is leading a review now. What we had written down at the beginning is that it would be ideal for somebody from a project who had participated in receiving a review to then be a security reviewer for another review, so they have both the inside and the outside perspective. I'm gonna go through these quickly, and then we can come back for any questions or points of discussion. I'll just go over this next part according to what I've heard from Justin and what's written on the slide.
So Justin reports that what seems to be working is: the completed assessments are valuable to the projects, it scales fairly well, and the review seems not too burdensome for the reviewers. Both the project and the assessors, the security reviewers, get to explain issues in their own words, so you get perspectives on security from different voices. What's not working so well: projects and reviewers tend to drop in and drop out; he has a question about TOC requirements, which I'll address later in the presentation; and assessments vary in level of detail and in type. Some go more into threat modeling, some have more user guidance review, and some have had a lightweight code review. So there's variability, which he saw as something that's not working. I'm not sure that's a problem, but that's just me.

My view: I generally plus-one all of Justin's comments on what's working. I've also participated in hearing from the TOC on each one of these and gotten positive feedback, both during the meetings where these things are presented and offline, where people have mentioned it to me. I find that the consistent assessment outline means that doing a second or third review as a security reviewer is much easier; you can just go through and do the review. Also, when I go back and think, oh my gosh, what was that project exactly, how would I describe it to somebody, it makes reference really easy, because I can go right to the relevant section. I don't know whether it'll be interesting to see whether other people read many reviews besides the co-chairs of security. But one of the things that's working: at the very beginning we were asking, will people really wanna do this? Will this be a burden? People do this for their jobs; will they wanna volunteer? And there's been a nice number of interested and qualified reviewers. So that's been good to see.
And then the project board, which I'll talk about in a minute, has really worked for me in terms of helping that communication outward and inward. Not working so much: the lack of clarity between the security assessment and the CNCF due diligence. Some of that is that their process has evolved, so we've had to absorb a little bit of evolution there. That hasn't always felt so great to the projects, but we work on it, and I'll talk about that more in a bit. And then the lack of consistent execution. Sometimes it's individual people dropping in and out, as Justin mentioned, and sometimes there's been process confusion: one person's waiting for the other person to do something, and we're just like, wait, it's been three weeks, what's going on? Or a month and a half. So I think there have been some improvements, and there could be more. So I wanted to highlight... go ahead.

One point on the observation that different assessments looked a little bit different: it's not that some assessments were lacking in certain areas, it's that some projects already had prior work done in certain areas, so they had a much more expansive amount of content there. I just want to point out that every assessment met the bar for what we wanted to get out of our assessments; it's just that some of them went a bit further and showed a little bit more. That's a good point. Thanks, Fred.

So I just want to point out, for people who haven't been here since the very beginning, or maybe have forgotten because it's been so long, what we've done: even though we said we were going to hold all the process improvements to the end, certain things that were getting in our way, we fixed.
The idea of having one of the reviewers, typically the security review lead, review the self-assessment and really go through it in detail has, I think, really helped. For one, everybody on the review team reads the same thing, so you get an unfiltered, unbiased read. It's also easier on people's time. The security assessment queue that I mentioned, I put a little picture of it here. Having a blocked column makes it clear which things are in the backlog, meaning we don't have the bandwidth to do them for whatever reason (we've been trying to only have one in process at a time, though we've relaxed that lately), and which ones are blocked because they're missing a component. Typically the missing component has been the self-assessment, which wasn't clear at the beginning, right? It wasn't clear that that would be the time-consuming bit, and I think there are process improvements we could do for that as well. There's also a done column, which I didn't illustrate because it's less exciting, although I'm super excited about it.

The other thing we added was the intake process, which got started before this assessment queue, because people were like, well, I don't wanna do my self-eval if I'm gonna be sitting there waiting. And who gets to decide? What if we have two projects ready at the same time and only one review team? Again, this is from when we felt like the review team was the scarce resource, and I think things are a little different now, but it was a healthy process. We talked it over with our TOC liaisons, then Joe Beda and Liz Rice, and with the different security reviewers, and then we presented it to the TOC writ large and adopted it. It's pretty straightforward; it's just prioritization. Mostly we do it on our own, but if at any point the TOC says we really need this project to be assessed, they can preempt that and add it to the queue.
The idea is that the TOC won't interrupt a review in progress, but they can bump something up in the backlog. So far it hasn't happened, but, as they say in government, we serve at the pleasure of the TOC. At least we chairs do. The other thing we improved is the conflict of interest guideline. There was a case where a whole project stalled because we were asking, is this a conflict of interest? Should somebody recuse themselves? And we were waiting weeks to make commitments about it, while other things would move forward without discussion of something that was maybe as significant. So I wrote up something at the TOC level, and Brandon did a nice PR. We actually still have an open issue where things need to be clarified; I noticed some formatting errors this morning. So some of these improvements are very much still in process, but they're there.

And then the big question is, how do the security assessments fit in with the project stages? About six months ago, the TOC did some work on really clarifying these stages. There's a process doc about how projects go from sandbox to incubation to graduation. This was always documented, but there was a lot of confusion. There's this due diligence that the Technical Oversight Committee is responsible for, looking at a project and saying, should it be part of the CNCF? What they're communicating here is that the majority of their due diligence, the really thorough review, happens before incubation, and there's a lower bar for sandbox. That wasn't clear before, or at least it seemed inconsistent to those of us who aren't in every TOC meeting, so this clarified things for us a lot, which was great. But how does this relate to our assessments?
Well, first, before we get to the assessments: the CNCF pays for a third-party security audit, which is a traditional quality audit with penetration testing and code inspection, ticking all those boxes, during incubation, as a prerequisite to graduation. It's important to know that that takes place, and it has taken place for half a dozen projects. There's an open issue: the audits are supposed to be listed in our repo, and we've been working on finding and collecting them. What we agreed so far, last year, with Chris Aniszczyk, who handles operations for the CNCF, is that when a new project requests an audit, he will tell them that they need to do their SIG Security assessment first, so that it ends up being a feeder for the audit. It should make the money spent on the audit more efficient, right? Because they can take the assessment as an input, and it creates a nice little pipeline. We specifically designed our assessments to slightly overlap with, but not be redundant with, the third-party security audit, because redundancy wouldn't make any sense.

So the proposal that some of us have talked about is that we could make the self-assessment a prerequisite for incubation. During the sandbox phase, all projects would be required to do a self-assessment. That becomes a feeder for the assessment, which would then be required before graduation, because it's a precondition to the audit, which is required for graduation. That's the idea. We have told the TOC liaisons and the team that we're not going to propose this until after we do our process improvements, after our first five. So while we've talked about this, the reason it isn't inked is that our precondition was five assessments, then process improvements, then this and whatever else we determine. I think the chairs and the TLs are basically saying, well, we're close enough.
We've completed four, we have two in progress, and it's long overdue; we thought we would have done five by now. So we should dive in. That's the end, I think.

Yeah, another thing I'd like to add on here, on the incubation due diligence: there's a second piece that comes as part of it. In the repo for the TOC process, there's a set of questions covering the details the TOC is looking for from the SIGs about a project. We were looking through those questions when we were creating the self-assessment template, trying to see how we could map onto those details, so that it's not only a good document for the assessment but also covers the main points the TOC is looking for in a project. Although at the moment, that's only for projects that are labeled as security projects specifically, not for other projects that should potentially be doing an assessment; not everything is being referred to SIG Security for incubation due diligence at the moment.

Yeah, that's a good point, Justin. For everybody who hasn't been in the weeds here: every project is categorized with one SIG, and sometimes more than one. So we get assigned by the TOC: here are the projects that you, SIG Security, have some kind of oversight participation with. Most of them are projects that serve a security purpose, what the assessments call security providers, right? They provide some key aspect of security. We have prioritized those projects for assessments, both in terms of our outreach and in terms of what we focus on. But we have said that we want the assessments to be done, ideally, by all projects, because all projects have a security aspect, right?
We just haven't had that experience yet, so it might be a little different. Maybe it's going to have a different template, maybe not; maybe it's just going to be a different shape. We don't know. So I guess it'll be an open question whether the self-assessment will be required at sandbox or maybe at incubation; we're going to figure that out. Thanks, Justin. Shall I move on to Justin Cappos's final thoughts? So Justin Cappos, not present, says he's had a heavily weighted voice on how we got here, which has been wonderful, right? Because he's driven us to actually do stuff. And many people have had this experience. So he encourages SIG Security to form a small subgroup and actually do this formal revision of the process, or propose revisions. He can be available for questions and opinions, and to participate if we do this at some time that's not the middle of the night, Shanghai time, but he will commit to at least participating asynchronously. And he's excited that we're at this point. So that's the brief presentation.

Thanks, Sarah and Brandon and Justin Cormack and Justin Cappos, if you're watching this later, for the presentation. I think it did an excellent job capturing a lot of the historical knowledge and context that we just don't have anywhere in the repo, and the small instances where it does exist would be very hard to find. We wanted to bring this up and present it to everybody because there have been a lot of questions about how we do these security evaluations, or security assessments, for the different projects that are presented to us. And we've gotten to the point, as Sarah indicated, where we've done enough of them, and it's gone on long enough, that we feel like we have a much better understanding of where process improvements need to occur, whether or not we actually know what those improvements explicitly need to be.
So, given the number of tickets we have open on this particular topic (there are six, last I checked, starting with the original issue to do the process improvement, which is issue number 167), we want to queue up a working group and hopefully get volunteers from the community. This is a great way to get involved if this is your first opportunity to start contributing to the SIG: to look at the way we've done things and the processes we've had, and how we can improve them and make them better, not only for members of the SIG but also for the projects within the CNCF that are moving through these different cycles. As Sarah said in the slide deck, we talked an awful lot about self-assessments happening pre-incubation, at some point in the sandbox stage, which would be ideal and helpful for a lot of projects, because the self-assessment gets them into more of a secure development mindset about what their project is doing and how it fits from a security perspective within the rest of the CNCF, as well as focusing on their own development practices. So that's the background for why we're having this conversation now and what the needs of the SIG actually are. I want to open it up to any questions folks may have for the TLs or the co-chairs or anybody else with history on this, and then hopefully we can start teasing out the working group and a little more of the direction for it to take.

I can share the fresh experience from the Keycloak side. In general, it was good and worth doing. The outcome, I think, has a lot of value, even as reference material for the project, outside of getting the badge of completing the assessment. On what could be improved: it has been time-intensive on the project side, and I was wondering, is it possibly a barrier for some projects?
It's about professional open source versus non-paid open source. We could roughly divide projects into: projects with companies sponsoring maintenance, where people are being paid to work on maintaining the community; projects with a good community but more targeted contributions, where companies show up to contribute a feature but don't really step up to do more; and projects where nobody is paid, where all of the maintainers are doing it in their evenings or engaged outside of work. And it was easily a few days of real work to do the self-assessment part. I know some projects where no maintainers are paid to work on the project, and they have aspirations for CNCF, and for them, I guess it would be a hard barrier to work on something like this outside of working hours, and they could stall. So that, and, as was mentioned, it would be worth having a timeframe, so it starts, it stops, and we don't iterate over and over on it. That would be helpful.

Okay, anyone else? So there are currently several issues; they're linked from the September 17th meeting, which, Brandon, I don't know if you've pulled up yet, but we'd like to try to get a group of volunteers to work on refining what this process looks like. You've heard from the Keycloak assessment how that went. We have plenty of other videos on YouTube from feedback sessions with other assessments that have been run. And we've also discussed a little bit about changes to the process that we've thought about or kicked around in the past. So I'm curious if anybody is interested in joining our working group to help refine what this process is and what it can become.
Yeah, and just to add on top of that: what we were discussing is that there are going to be various ways to improve the process. Some of it may be redefining it; some of it may be improving the execution. And from this working group we may split up into slightly smaller groups to focus on whatever people are interested in. Yeah, I was thinking the first step might be to triage things into buckets. There's a bucket of things like "this isn't clear, I don't understand this", or rather, not "this doesn't make sense to me" but "this clearly isn't documented well" in certain places. And then there are other issues which are in conflict with each other: I have this idea, which is directly different from that other idea. So we separate those, and maybe do the details second, right? Because if we radically change something, then the details will change, right? Then we can discuss in a small group what we think, and see whether that group can get aligned or not. At least tease apart the big questions and then present back to this group and say, okay, we considered A through F and we think D and F are good big changes; or, we considered all these things and we fundamentally think the process is reasonable with these tweaks, or whatever. And then go forth. An important part of the process is having it all written down well, so that people don't get confused in the future. Yeah, so I linked issue 167, which is the source issue for the first-five improvements process. And it looks like there are quite a few people interested in contributing to this, which is excellent; happy to hear that.
If you all can comment on the ticket, that way we know who's interested in joining, and we'll also post it in the SIG Security channel for anybody that's not on the call today, to start consolidating that. Sarah, do we need a co-chair or a TL to help run this? I think, like last time, I can be pulled in to help with this. Awesome, yes, we've got Brandon involved. Yeah, and Justin and I will participate as needed, and Brandon agreed to be the facilitator of the effort. Totally pragmatic question: is this a nine-to-five Eastern time block kind of thing, or how do the meetings operate? For the working group, at least what we've done with other efforts is, depending on the set of participants, we'll find a time that works for everyone. Got it, okay. So far we haven't had a subgroup where it was impossible to find a time. Theoretically it's possible, but in reality we've found it to be not that hard. If it breaks, we'll fix it. Okay, so Brandon's gonna facilitate, with Justin and Sarah helping out. And if you can go ahead and comment on issue 167, which is linked in the chat, that way Brandon and company can start getting together, doing the scheduling, and consolidating all the comments and all the PRs and issues that are open about this topic. I'm happy to participate as well, if it's useful. Yep, most definitely. Okay, that went really quick. Does anybody have anything else about the assessments process they wanna bring up right now? I have a quick question for Justin, actually. Are we seeing a need to present the Keycloak assessment to the TOC? Good question. I'm not sure at this point, but quite possibly, yes. Okay. So in the past, Liz Rice has said she always wants us to present. And so I think we should always offer; things may change, so we offer and then they see if they can queue it up. Okay, yeah, that makes sense then. Yeah.
And if that gets a green light, I can go ahead and put it on the schedule. The next slot that's likely going to be available is probably later in October. Let me go back and look at a calendar here. Yeah, it looks like the next available meeting is probably going to be October 20th. So plenty of time to get that on the schedules. Yep, and the PR was just marked ready for review, so I think it'd be perfect timing as well. Yeah, and if you wouldn't mind putting this into your normal updates next meeting: the SIG updates to the TOC are October 6th, and you're all about to get pinged to come update your slides. So yeah, that'd be super. Thank you. What's the format? Is it repeating the presentation that happened at the SIG, or not? It's you presenting a summary, right? There's one slide. It's very challenging to get it all onto that one slide, but the lead security reviewer is responsible for creating it. And for the presentation, half the slide is the project saying their thing and half the slide is what the SIG says. Depending on how much time we're allotted, one person might give the whole thing, but it's important that a person from the project and a person from the review team be there in case there are questions for either. All right, let me try and find that set of slides so we can share it with you. It's a miracle of compression. Well, Brandon tries to make it one.

Did anybody have anything else they wanted to cover? I have an announcement. Go ahead, Sarah. So we have nominated Emily as a new co-chair, and that's been approved by the current chairs, including Dan, whose official term has ended; we have a little not-quite overlap. The TOC liaisons have all approved this. The process is that two-thirds of the TOC needs to vote on any SIG chair. So the nomination has happened; it's linked in the notes and in the Slack channel.
The TOC generally really likes it when the community says stuff, so please, specific comments are welcome, and chime in on the thread. I think you might have to be a member, like sign up for the TOC list, to comment. I just wanted to make sure everybody was aware that that's in process. And that's my announcement. Oh, and Dan's here. I did also want to say thank you, Dan. Dan was one of the inaugural chairs of what was originally the SAFE working group, Secure Access For Everyone, or something like that; I think we had different opinions on what the acronym stood for. And then it became SIG Security at CNCF. Dan, I didn't know you when we started this process, so it's been wonderful getting to know you and working with you, and thank you so much. Thank you, Sarah. Great to see you. So is that everything we have for today? Yeah, I found the link to the slides; I put it in the chat. Okay, anybody else have any announcements? No? Then I will release everyone and give you all back about 15 minutes. Thanks, everybody. Thank you. Congrats, Emily. Hey, Emily, I will change the calendar invite to reflect the new passcode. Awesome. Yes, you shouldn't have to do anything, but watch this space for next week: hey look, Zoom is now gated on passwords, hooray. Okay, cool. Thank you. Should be fun, more to come. Bye.