I will take it away. Welcome, welcome, ladies and gentlemen. Hola, Bilbao. ¿Cómo están? Nice. That's all I know. I'm sorry — my kids have taken many years of Spanish, and I had to help with vocab, and that's about as close as I get. So my name is Chris Robinson, AKA CRob. I'm 50% of your presenters here. My partner in crime is Mr. Dr. David Wheeler. We are both participants and leaders within the OpenSSF, and in particular the Best Practices Working Group, which is a group within the OpenSSF. The OpenSSF, for those of you who don't know, is dedicated to improving the security of open source software for everyone. We have a lot of different avenues for doing this: things like supply chain security, helping communicate information around vulnerabilities, or the Best Practices group. We focus on helping developers and maintainers offer the best software they can, teaching them good practices and showing them where good tools are. There are a bunch of different working groups, so if you don't know about the OpenSSF, feel free to talk to myself, David, or any of the other folks here; we're glad to chat more about our different initiatives. We are a working group that reports to the Technical Advisory Committee, which reports out through the governing board. Ta-da. Look at that. Fancy, fancy.

So within the OpenSSF Best Practices Working Group, we divide our work into roughly three categories. We have projects and SIGs and documents that help identify good security practices. We have another set that helps developers and maintainers adopt those good practices. And then we also focus a lot of our time on learning, so education. We have a secure development fundamentals class, and we have a hands-on developer platform called SKF, which just got renamed — they changed the name.
So if you're taking the class and you're learning about SQL injection, you can go do some hands-on labs in SKF and learn how not to commit those errors to your code. Pretty exciting. So we're going to talk about two of the software projects that are part of the Best Practices Working Group. I'm going to talk about Security Scorecard, and yay, we do have representatives from the project here. So if you want to learn more from somebody that actually knows something, you can talk to Steven or any of the other folks that work on Scorecard. And then David's going to talk about the Best Practices badge.

So Scorecard is a tool that serves two different kinds of constituents. It allows a developer or a project to understand how they stack up against a set of automated criteria around documented security good practice. And it also allows consumers — if I'm an enterprise using your project, it allows me to understand the security qualities of this dependency I'm trying to pull in. My friend Ryan Ware gave a talk yesterday: my company, Intel, has committed to leveraging Scorecard as part of our open source management practices, and we're in the process of putting all of our external repos into Scorecard. The process is good and bad and ugly — there are some great things we've learned, and we learned a lot about things we're not doing well. So from a developer standpoint, it's helping us improve our processes in kind of a kaizen mentality.

Scorecard automatically scores projects on a scale of zero to 10, and I think there are 18 or so different checks. And nightly, it'll run — nightly or weekly? It runs weekly, and it scans like a million projects right now? Yeah, quite a lot. So if you're curious whether the folks behind a project are committed to security good practice, take a look at the Scorecard results and you'll be able to get some more information. It also has the ability to produce an aggregated weighted score.
There are scores in different areas, and as with anything open source, patches and improvements are always welcome. My friends at Intel will be working to collaborate, to add some more ideas, and hopefully some development power to help make it even better. They just recently instituted a REST API to query the data. And by far the most exciting recent request: initially the project was developed within GitHub, the GitHub environment, so it worked really well there because the team knew a lot about that ecosystem. Well, we just recently had a community member who said, I'm a GitLab person, and they were able to work with the Scorecard team to add GitLab capability, which is awesome. And the name is Scorecard, singular — even though we're looking at multiple security qualities, sometimes you may see some of the older literature call it Scorecards. That's wrong. Don't be wrong.

So I'm going to give you just a quick overview of some of the checks. First off, they do a source risk assessment. They're looking at several different qualities: is branch protection enabled? How many contributors does this project have? Do we see any kind of traditionally dangerous workflows? They look at maintenance: how often are the dependencies updated? What's the license? Is the project maintained? Does this project have a security policy? That's important — if you're going to ingest something as part of your own work, or as a consumer bring it into your organization, you want to understand how that project supports the product and how they will react when there are defects or security vulnerabilities reported to them. There's a CI test check where we look at how they do CI testing. We look at whether they're doing fuzzing and whether they're doing static code analysis. So these are all things that are externally and automatically visible — the tool can go out and probe the project.
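The checks just described are all signals a machine can probe without human help. Scorecard itself is written in Go and queries the forge's API, but the flavor of those checks can be sketched in a few lines of Python; the file names and heuristics below are illustrative assumptions, not Scorecard's actual logic.

```python
from pathlib import Path

def probe_repo(repo: Path) -> dict:
    """Toy version of a few externally visible, Scorecard-style signals.
    Real Scorecard talks to the GitHub/GitLab API; this sketch just
    inspects a local checkout, and the file names are assumptions."""
    def any_file(*names):
        return any((repo / n).is_file() for n in names)
    return {
        "Security-Policy": any_file("SECURITY.md", ".github/SECURITY.md",
                                    "docs/SECURITY.md"),
        "License": any_file("LICENSE", "LICENSE.md", "COPYING"),
        "CI-Workflows": (repo / ".github" / "workflows").is_dir(),
    }
```

The point is the same as the talk's: every one of these answers comes from artifacts the project already publishes, so no one has to fill in a form.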
So there's not necessarily a person sitting there looking at all this; ideally, you trust the computer — that's the theory behind all that. We also look at the build risk assessment. Is the project pinning dependencies? Do they publish packages as part of a CI system? Are they doing signed releases? These are all elements that, as consumers, are pretty critical to understand so you can assess your own risk of using the software. How do they manage vulnerabilities? Does the project have a list of unfixed vulnerabilities? I want to do some investigation if the project hasn't patched vulnerabilities in a year, for example.

The next part of our talk is going to cover the Best Practices badge. That's another program a project can go through to show off their security acumen: do they have security training? Do they understand some of these good practices? The Scorecard links up with the Best Practices badge — if the project has attained that badge, it'll show up in the criteria Scorecard is looking at.

Our friends at Sonatype do a lot of reports, and they actually cited Scorecard in their 2022 State of the Software Supply Chain report, which is pretty great, really nice. They noted that, looking at Scorecard, they saw some trends: if projects are doing code review, they typically are going to have fewer vulnerabilities, or be able to react more quickly as vulnerabilities are found. They looked at checked-in binaries, pinned dependencies, and branch protection. So our friends at Sonatype called out all of these as good characteristics, and they trumpeted the work of the Scorecard team. There's a better stat — please, Eddie, what's the better stat? So to restate: the presence of Scorecard and mean time to update was cited as a leading indicator for the security of a project. Pretty exciting stuff. The checks can run automatically.
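That "unpatched for a year" investigation trigger is easy to express as a staleness check. A minimal sketch, assuming a hypothetical list of (vulnerability ID, advisory date) pairs rather than any real vulnerability-feed format:

```python
from datetime import date, timedelta

def stale_vulns(open_vulns, today, max_age_days=365):
    """Return IDs of unfixed vulnerabilities older than max_age_days.
    open_vulns is a hypothetical list of (id, published_date) tuples;
    real data would come from an advisory feed such as OSV."""
    cutoff = today - timedelta(days=max_age_days)
    return [vid for vid, published in open_vulns if published < cutoff]
```

Anything this returns is a prompt for a human conversation with the project, not an automatic verdict.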
You can integrate it into your own workflows — there's a GitHub Action with which you can integrate it into your own project, so as you're going through and doing your work, Scorecard will do its analysis. You also can run it manually via the command line. There is an API that allows you to pull data, and there's a big public data set — oh man, big data, that's very exciting. And if you know of a project called deps.dev, they also include the Scorecard results there. So that's another way we're kind of triangulating our work and being able to show off the results. Again, if a project is following these behaviors in Scorecard, there are numerous locations where that's going to get cited, to help increase the eminence of your project and your work if you're doing things the right way — you know, securely.

Oh, is it? So the astute audience member noted that there is a plural "scorecards" in the API URL, which the project team assures me they're working on updating, so it will be singular in the very near future. Nice catch — if I had a prize, you would have just won it. This is what I wanted.

So if you're running an open source project, Scorecard is publicly available for either GitHub or GitLab. If you have another source code repository that is your favorite, feel free to work with our Scorecard team to see if that's something you can get put on the roadmap. There are some changes you want to make: you would set up the GitHub Action; you're exchanging keys, right? — you need to set a token or something for the authorization to be able to scan your repo; you modify your README; and then you sit back, wait for the results to come in, and understand how you want to react to them. Depending on how you develop, Scorecard might have a different opinion.
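To make the API concrete: the public results live at api.securityscorecards.dev, and a sketch of pulling a repo's result and flagging low-scoring checks might look like this. The endpoint path and JSON field names follow the published result format as I understand it, but treat them as assumptions and verify against the current API docs.

```python
import json
from urllib.request import urlopen

# Assumed endpoint shape for the public Scorecard results API.
API = "https://api.securityscorecards.dev/projects/github.com/{org}/{repo}"

def low_checks(result: dict, threshold: int = 5):
    """List (name, score) for checks scoring below threshold.
    Scores of -1 mean 'inconclusive' in Scorecard results and are skipped.
    Field names assume the published JSON result format."""
    return [(c["name"], c["score"])
            for c in result.get("checks", [])
            if 0 <= c.get("score", -1) < threshold]

# Live usage (requires network access):
#   result = json.load(urlopen(API.format(org="ossf", repo="scorecard")))
#   print(result["score"], low_checks(result))
```

The aggregate `score` field is the zero-to-10 weighted number mentioned earlier; the per-check list is what you'd actually triage.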
So you need to kind of understand what you might want to do, where you might want to prioritize things in your own project process to bump up your score, if that's desirable. As we mentioned, the GitLab integration, which is about a month, month and a half old, was one of our most requested features for folks that live in that ecosystem. We also just changed the license for the data, so now we have more clearly stated what you are allowed to do and how your data is used. We're looking at improving automation to get better CI pipeline support, and there are some new metrics the team is working on. They just recently worked with the foundation to publish a roadmap for the next 16 to 18 months, and they've got a pretty exciting backlog. MTTU, mean time to update, is one of the things we're looking at getting implemented, and generally we're looking at cleaning things up and making the code more efficient. And as I mentioned, we have Scorecard project members here in the room, so if you're curious to learn more, you can learn it either through David and me or through those project members.

I'm going to turn it over to Mr. Wheeler, and I'm going to throw out a challenge. We did this presentation in Vancouver, and as David was talking about the Best Practices badge, we actually had several developers in the room who went through and started their Best Practices badge journey — I think one fella got almost all the way through, like 99% done, in the session. So let's talk about the Best Practices badge. It's a nice counterpoint, and one of the key factors in Scorecard. David, take it away.

Thank you. Thank you. The room's mic is up here, so I'm temporarily trapped up here so that people can hear and record. So — I'm sorry? I'm being trapped. Yeah, I prefer to wander myself, but you know what, I want to make sure that everybody, including people who watch the video, can hear the information.
So, Scorecard and the Best Practices badge work together, and I'd like to hopefully clarify a little bit about how that happens. One of the really awesome things about Scorecard is that it automatically measures things: it's a tool, you run it, you don't need any cooperation from the project, off you go. But unfortunately, like all tools, it has this fundamental problem of false positives and false negatives, and it can only measure what you can measure automatically. That doesn't make Scorecard useless — far from it, it's really helpful — but there are some challenges with that approach. So the Best Practices badge takes, in some ways, a different approach. It's very much an interview style: here are the kinds of things that we want to see. In some cases we can automate that, in which case we do. In some cases we can't, and then we're looking for feedback from the project on how that is going.

All right, so the Best Practices badge identifies a set of best practices for open source software projects, based on existing well-run projects, and if a project meets certain criteria, it gets a badge. It's not just fill-in-a-form, as I mentioned already. There is some automation where we can manage it — we automatically detect some things. In some cases we can be certain, and if someone makes a false claim, we'll just reject it. In many other cases, we think it's probably this, and we'll help you fill it in; if it turns out the estimate's wrong, that's fine, you can override it, as long as it's not certain that your claim is wrong. There are a few badge levels — passing, silver, gold — and it's available in a variety of languages. And participation continues to grow: we now have over 6,000 projects participating, and over 1,000 that have actually earned a passing badge. Further, it has a number of criteria.
I'm not going to go into detail about every single criterion, but instead just note that the whole point of these criteria is that they've got to be relevant, attainable, clear, and built on the consensus of developers and users who've worked with open source software. In general, we work hard to not be dependent on any particular technology; we try to be as agnostic as we can about technologies, products, services, and so on. And it doesn't cost you anything. Perhaps most important, you don't have to do everything all at once — we expect people, frankly, to work incrementally.

So we use the usual IETF terms: MUST, MUST NOT, SHOULD, SHOULD NOT, MAY. In the case of MUSTs, they are required. SHOULDs are normally required, but if you can justify not doing one, that's fine. We have found that in a number of cases there are things that are good, but there are so many different reasons you might not be able to do them that we've put them in another category called SUGGESTED. There, the requirement is that you need to have thought about it and answered the question. If you say no, that's okay — at least you thought about it, because for many projects you should be doing it, and for many others it doesn't make sense. And we don't want to require something that doesn't make any sense.

Hello. Well, that's not good. Ah, sorry. That happened to you yesterday. Well, I didn't want it to happen to me. Ha, ha, ha. All right, let's see here. Well, we can do this — the old-school keyboards, they work, yay. All right.

Okay, so I mentioned earlier there are three levels of the badge — I guess a fourth in process — okay? You can get passing, silver, and gold. Passing is, in some sense, what typically well-run projects should follow. However, it really is an achievement to get it. What we found is that people generally read the criteria and say, yeah, that makes sense, we should do that.
But when you say, and you have to do all of these — the set of projects that do all of them is far smaller. For any one criterion, many, many projects will meet it; but doing them all, that's a harder step. And frankly, that's exactly why you want to have these kinds of forms and lists. So — oh, wait a minute, I forgot that. Silver is much more stringent, but is still expected to be achievable by a single-person project. Gold includes criteria that we know you can't achieve until you have multiple people on the project. Ideally, you would like every project to be gold, but the reality is that a vast number of open source projects are single-person projects, and we just can't require certain things — like, hey, second-person review — when there is no second person.

All right. I'm not going to go through every single criterion, but I do want to give highlights. These are the basic categories of the passing badge. Basically things like: hey, this is supposed to be about open source software, so it actually has to be open source software. You need to have a version control repo — we still see projects not doing this; shame on you. You've got to publish the process for reporting vulnerabilities. This, by the way, is one of the most commonly missed ones: now, how do I report a vulnerability? Well, it's a secret. That's a terrible thing to keep secret. Please don't keep that a secret. Quality: you've got to have at least one test suite, and as you add new functionality, add tests to match. You need your software developers to know common kinds of vulnerabilities and how to prevent them. And you've got to use at least one static analysis tool. Silver has things like: find a way so that if your main person dies, the project can move on. It could be as simple as putting some information in your will. But sadly, we are all mortal.
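The MUST/SHOULD/SUGGESTED mechanics described above can be modeled very simply. This is a sketch of the rules as stated in the talk, not the real BadgeApp implementation (which is a Ruby on Rails application with more nuance); the category and status names are illustrative.

```python
# Illustrative statuses: "met", "unmet", "na" (not applicable), or ""
# (unanswered), plus an optional justification for SHOULD criteria.
def criterion_ok(category, status, justification=""):
    if category == "MUST":
        return status in ("met", "na")          # required, no excuses
    if category == "SHOULD":
        # normally required, but a written justification is acceptable
        return status in ("met", "na") or bool(justification)
    if category == "SUGGESTED":
        return status != ""                      # merely answering counts
    raise ValueError(f"unknown category: {category}")

def passing(criteria):
    """criteria: iterable of (category, status, justification) tuples.
    A badge level requires every one of its criteria to be satisfied."""
    return all(criterion_ok(*c) for c in criteria)
```

This also shows why "doing them all" is the hard part: a single unmet MUST, on any of dozens of criteria, blocks the badge.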
And of course, you need to check all your inputs from untrusted sources. If you know you're getting data from places you can't trust, check that data first. That doesn't prevent all attacks, but it prevents many. Gold has things like: you need to have two unassociated significant contributors. Unassociated means from two different organizations. And the rationale — oh yes, question. If you go there, you can click on a button called Details and it will give you all that. Yeah, in fact — you'll be unsurprised to know — we had to anticipate questions like that. So we try to make the short text really clear, and then there's a Details button that explains, now what do you mean by that word, whatever the word is. And if it's not clear enough, please contact us. We have a group of people who get to, you know, mull and discuss and argue and try to come up with a reasonable resolution. We want to see two-factor authentication — unfortunately, developer accounts are increasingly getting popped and attackers are doing bad things with them, so thankfully folks are moving more and more to 2FA. And you need to use some hardening mechanisms.

Okay, let me compare Scorecard and the Best Practices badge, because some people think that these are in contrast. But I lead the Best Practices badge work and I'm heavily involved in Scorecard; we are not opposed to each other here, we are working together. Scorecard has 19 checks. In contrast, just the passing badge has 67 criteria, and then you add more for silver and gold — clearly there are things that the badge is asking for that Scorecard can't check. And again, this is partially the focus: Scorecard is very focused on what can I quickly tell you, completely automatically. For a lot of folks, that is incredibly valuable information. But it means that you can't do certain kinds of checks, because I don't even know how to do some of these in an automatic way. Certainly with... Now, starting with Scorecard's checks, I did a quick...
It turns out there are a couple of direct mappings, but as you'd be unsurprised to know, their focuses are different. So this does not mean that Scorecard's bad at all — Scorecard is great for what it does. The Best Practices badge is one of the criteria in Scorecard because it can actually support and help with those others.

I've got an interesting quote. I'm actually going to point out OWASP ZAP as a fun example, because they were one of the projects that worked on this. They eventually did get a passing badge, but the challenge they found was that they met almost all the criteria — except they had no tests. And before you laugh too hard, that's actually not unusual, both in open source and proprietary software: yeah, we should do tests. Nobody argues that you shouldn't have any tests, but actually making a test suite — wow, that's a lot of work. But they wanted the badge, so they finally added automated testing, and they said, we're so glad that you kind of made us do that. It's not that we made them; it's that we forced them to confront a weakness that they acknowledged once we pointed it out, and they resolved it — and that's better for everyone, better for everybody.

If you develop open source software, go get your badge. We have a new domain, www.bestpractices.dev — yay, thank you; more work than it should have been. So basically go click on that and get your badge started. Don't worry about silver and gold; just focus on passing to start with. As I said, it's an actual achievement, because many, many open source projects meet many criteria and then, oh yeah, I don't do X — and your X may be different. But start, do things incrementally; not doing things at first is okay. That means you know where you stand, and then you can slowly work things off as you go. Some tips — you know, we're kind of running short on time, so I'm going to give these a little short shrift; the slides will be publicly available.
But things like: you need to tell people how to report vulnerabilities. For most people the answer is, here's an email address — or, if you're on GitHub, they have private vulnerability reporting. That's a recent addition to GitHub, and yes, I and many other people have been nagging GitHub for years, so thank you, GitHub, for doing that. Then make a SECURITY.md file that tells people how to do it — whatever it is, this is how to report it. You know, hey, you want to know how to develop secure software? We have a free course — that's also part of the Best Practices Working Group. You don't have to take that course; just learn what the common problems are and how to prevent them ahead of time. Have to have a test suite? We don't actually care what test framework you use — we just say, please pick one, okay? And the same goes for static analysis tools.

So, you know, we just moved to bestpractices.dev. We plan for more automation: in particular, the Best Practices badge already is one of the criteria in Scorecard, and the plan is to go the other direction as well, to bring the data from Scorecard into the badge. In particular, there's been a lot of work on Scorecard to improve false positives and false negatives — those incremental improvements have really aggregated over time — and we're really excited to be able to do that. And yes, we are going to update that logo; we have some drafts, we just need to turn them into reality. And code cleanups, because there's always a need for code cleanups.

So we actually had a student who looked at badging in general, did some analysis of things like the Best Practices badge and Scorecard, and basically found that, wow, these things are really helpful. They really help both the projects, to figure out where they stand, and other folks who are thinking about using open source projects. And so, if you are using open source software as a dependency, which one should you use? The answer is yes, okay?
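Since the vulnerability-reporting process is one of the most commonly missed criteria, here's a minimal sketch that drops a SECURITY.md into a repo. The wording, contact address, and response-time figure are placeholders to adapt to your project, not prescribed text.

```python
from pathlib import Path

# Placeholder wording; replace the contact and timing with your own.
TEMPLATE = """\
# Security Policy

## Reporting a Vulnerability

Please report suspected vulnerabilities privately to
{contact}. Do not open a public issue for security reports.
We aim to acknowledge reports within {sla_days} days.
"""

def write_security_md(repo: Path, contact: str, sla_days: int = 14) -> Path:
    """Write a minimal SECURITY.md into the repo root and return its path."""
    path = repo / "SECURITY.md"
    path.write_text(TEMPLATE.format(contact=contact, sla_days=sla_days))
    return path
```

Whatever the channel is — an email address, GitHub private reporting — the point is simply that the answer is published where reporters will look.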
They both have their advantages. The great advantage of Scorecard: right away, you get some answers — and not just about your own project, but any other project. That is tremendously powerful. All tools — and this is not unique to Scorecard — have false positives and false negatives. So when you get Scorecard's results, if they matter to you, double check, okay? And by the way, please file an issue, because we're trying to burn those down. We've made tremendous strides, but improvement is always possible. With the Best Practices badge — if a project has a badge, or is at least working on one, that's a great sign, okay? And we are working on dashboards and other things to make this information, and other related information, even more available. So there's our contrast.

Well, and I was going to mention some additional things on our collective roadmaps. As I mentioned, the Scorecard team put together a 16-month-or-so roadmap where they're planning their work over the next year. We'll be doing the same across all of the Best Practices projects. And one of the things — let's see, can I get P? — David mentioned: last year we published a Concise Guide to Evaluating Open Source Software and a Concise Guide to Developing More Secure Software. We plan on taking those two artifacts plus a third: we just released a best practices guide for source code management repositories. So we're going to take these guides, see where it makes sense to integrate them into the badge or Scorecard, then file the appropriate PRs and start the work on those types of things. So we're constantly improving, and both are very active and dynamic projects. And then, do you want to bring it home with your cookies? Sure. So, yeah — I've got to do this mic thing. All right, get back here for the mic. All right, so basically each of them has pros and cons.
I've emphasized several times, because I think it's important to understand, that each of these has pros and cons. And frankly, one of Scorecard's heuristics is whether you've achieved a Best Practices badge. But Scorecard — false positives, false negatives — we've made tremendous strides on them. That said, we're aware that there are some tools we don't detect yet for static analysis and dynamic analysis, and some CI systems we didn't support at all we've now added, but there are still improvements to make. Originally it was only GitHub; now Scorecard supports GitLab as well, and that covers a tremendous number of software projects. We know it's not all of them, but again, we are making tremendous improvements. The Best Practices badge counters false results by requiring a human, but that has its own negatives, because now we have to deal with humans, and some of the risks of humans are things like: they lie sometimes. So we do things like requiring the answers to be posted publicly. We do some automated checks to detect claims that are just not true. And we do actually have some human review, particularly if somebody notifies us. Fundamentally, our view is that these two work like chocolate and peanut butter — they work together nicely. Please enjoy them.

And the TL;DR of both of these projects is: patches are always welcome. We are trying to improve developers' lives, and increase the eminence and renown of projects by showcasing the good work you're doing. If you have a question, or you want to see a change, or there's a defect — please, PRs and issues are open. We want that work. And again, if you have other suggestions on how we can improve the quality of developers' and maintainers' lives, let us know if there are other artifacts we can assemble, or tools we can help share with the community. And we've got about 10 minutes, as my phone goes off, so we can take questions. What questions do you have for us?
We have some additional experts in the audience. Take it away, sir. Okay, let me repeat it, because I think we're getting recorded and I don't think what you just said is going to get into the recording. So he asked: what's the story for versioning of the Best Practices badge, specifically? The quick answer is there's actually a governance doc that talks about this. But in short: it's a non-trivial amount of effort to get a badge, and we don't want to cut people off at the knees with a sudden, rapid change. So our announced plan is that, at most annually, we will change the criteria. And the way we do that is we add criteria, but whenever we add new criteria, we mark them as what's called "future." Future criteria are never graded — they have no impact on whether you achieve passing, silver, or gold yet. The idea is that we add future criteria and give people time to respond and say, yes I do that, or no I don't but I'm going to change that, and then go make those changes. So if you have achieved a badge, we want to give you time as the criteria change, and we only want to change them at most annually.

Now, some people have noticed — Keith — that there hasn't been a change in the last two years. Right. And that's actually, I guess, with malice aforethought — I don't know how to finish your phrase — but right now we've been focused more on getting people to the passing level as it is, because while we could always improve and add criteria, right now the projects we're most worried about aren't even achieving the passing level yet. We were originally planning to add new criteria this year — I guess we could technically still do that — but I suspect we're probably going to do it the following year, because basically we're going to go through that process.
I can't commit, I guess, yet — that's so far in the future — but we do want to occasionally do updates. We're aware that that causes some angst. Scorecard is a little different, because they're running everything automatically, so they very much just make the change and here's your new result. But they're trying not to be arbitrary either: they don't want to make too many changes, and they want to give people time to adapt. So in both cases, we need to not just stay static forever — that doesn't serve anyone — but we also need to be mindful as we make changes, so that they don't negatively impact the people who are trying to do what's good. A long answer to a short question — I'm sorry, did that answer your question? Okay. Oh, we have a microphone. I think you may have been up first.

Okay, all right, mine's a short one: is the best practice for the Best Practices repository to refer to it in the singular or the plural, and is that consistent with other projects, which shall remain unnamed? There are many best practices, so that project will remain plural. Well, it is the Best Practices badge, so we have a plural and a singular, which means everyone's happy. For the grammarians out there.

Okay, you talked about checking that the release is signed, but nowadays many projects distribute containers, and they're not signed — even if it's a security tool — and in many cases I found multiple vulnerabilities in the containers. Is this something on your radar for the future? Yeah — to be fair, I view container images (let me be specific: not the running container, but a container image) as just another package format. It's a way of distributing bits that can run. That sounds like a package to me. So we really don't distinguish between, say, a system-level package, a language-ecosystem package like PyPI, a container, or a virtual machine image, which is another package format.
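The "future criteria are never graded" rule described above amounts to filtering before you evaluate. A small sketch, with an illustrative `future` flag on each criterion (the real badge application's data model differs):

```python
def gradable(criteria):
    """Drop criteria flagged 'future' so they cannot affect a badge yet.
    Each criterion is a dict; the 'future' flag name is illustrative."""
    return [c for c in criteria if not c.get("future", False)]

def completion(criteria):
    """Percent of gradable criteria currently met (0-100)."""
    live = gradable(criteria)
    met = sum(1 for c in live if c.get("met"))
    return round(100 * met / len(live)) if live else 100
```

So a newly added future criterion can sit unmet on every project's page without knocking anyone off 100% until the annual cutover.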
Now, to be fair, for both Scorecard and the Best Practices badge, a lot of the focus is much earlier — on the source and the committers — and not necessarily the build. So for many of the criteria it doesn't matter; but for the ones that do involve the build, I would say it doesn't matter whether it's a container format or any of the other formats. You're building it — please follow the good practices for builds. And that's also an opportunity for us to collaborate with the other foundation projects, like SLSA or S2C2F, that are focused on the build piece and the consumption piece more rigorously; we're focused on the origination, generally. Additional questions?

Just a quick mention that I fixed the Scorecard Best Practices badge while you were talking. The system works. Yeah, I'll say thank you. And for those of you others who are leading open source projects, the gauntlet has been thrown.

A question as a customer-facing application developer: are there already initiatives or tools to get visibility into all those transitive dependencies down the stack — tools that already analyze and leverage all this data for customer-facing apps? So, specific to the Best Practices badge, there isn't tooling that enables that, but things like Scorecard, and frameworks like SLSA, or the GUAC project, are probably a better way to look at your dependencies and give you that output. And Mike Lieberman actually can talk about the amazing GUAC project. Just real briefly: yeah, so GUAC, which is a database for all this stuff, can ingest all the data from the projects that you are using in your supply chain, and it will also ingest things like Best Practices scores, so you can go back and look at all of the transitive dependencies and what their Scorecard scores are, and all that. We're almost out of time. Yeah.
Yeah, I just wanted to sort of shout out about this, because I think, conceptually, if you haven't used any of this — people see "security" and they assume that it's a security tool. This is security compliance, very specifically. If you don't know how the command line works, you can still do most of the Best Practices badge; it's about understanding and operations. And I did want to do one quick shout-out: there will be a security slam in the CNCF before the next KubeCon, where we all reach out and make sure that every maintained project has all of these best practices. So I guess my last note on that: if you want specific best practices around your architecture or your style of working, go out and just talk to other developers that are like your project. Thank you. It's great advice. We have time for like one more question — anything else? Thank you for the overwhelming volume of attendance. We appreciate your attention, and we hope you got a little something out of it. Thank you. Thank you so much.