Hello, everyone, and thank you for joining my presentation for CloudNativeSecurityCon. This presentation is about the state of vulnerabilities in cloud native applications. My name is Magno Logan, and I work for Trend Micro as a security researcher. Just a little bit about myself: as I said, I'm a senior researcher at Trend Micro and an information security specialist, currently doing cloud and container security research, mostly working with Kubernetes security and also open source security. I've been a member of the CNCF Security TAG since October last year, and I've been doing container and Kubernetes security research, as I said. I also have a personal blog called KatanaSec.com, where I used to post articles at least once a month. You'll also find all my contact information there, as well as all my previous presentations, either slides or videos from previous talks, going back to 2011, when I started speaking at tech conferences. Before we start, I just want to give you a quick overview of what we're going to cover in this talk. Basically, this is a project that I've been working on for a couple of months now, and it's still ongoing; I think there are some improvements required, and we're going to talk about that at the end. I'm going to explain the idea for the project, what the CNCF security audits are, how they're done, and the list of the ones that have been performed so far on many different projects. Then we'll go into the details of the results of the data that we collected and analyzed. Basically, I'm going to explain the methodology that we used, the analysis that we did, and the results that we got, and also talk a little bit about the third party risks of some of these projects.
To conclude and summarize everything, I'm going to give some recommendations, both to the CNCF and to organizations and users, and some next steps on how I'm planning to keep improving this project and run it as a kind of periodic research. Before I start, I just want to give a little disclaimer here. The idea for this project is to generate awareness of the need for more cloud native security and open source security. The idea is not to generate FUD, as we call it in information security, which means fear, uncertainty, and doubt, around cloud native projects. We just want to raise awareness and show that security is important for these projects, especially since some of them are used by many different organizations and we rely on them to run our systems, so it's important to focus on security as well. A shameless plug before we start with the content: I have created a GitHub project which I call the Almost Awesome Kubernetes Security List. It has a lot of content around Kubernetes and Kubernetes security: videos, books, articles, and everything you need to know, I think, to learn more about Kubernetes and Kubernetes security, at least to get started. There are a lot of contributors there already; people from the CNCF have participated and submitted new links and new information. This has been going for almost a year now, and it's been really incredible to contribute to the community in that way. So feel free to star this project, fork it, and submit new PRs if you want. I'm always available and looking for new content to add there. So how did the idea for this project start? In the beginning of this year, I was helping the CNCF Security TAG update the list of security audits that had been done on different CNCF projects, and I saw that pretty much all of those security audits were PDF reports.
Some of them were hidden inside the repositories of those projects, so they were kind of hard to find and you had to dig through them. And especially since they were PDFs, there was no place where we could check the status of those vulnerabilities, or whether any CVE had been reported because of a vulnerability that was detected. Those security audits are usually organized or directed by the CNCF, which hires third party consulting firms to test the security of different CNCF projects. As we can see here on the slides, many different projects have gone through security audits: Kubernetes, Helm, gRPC, and etcd are some examples. You can also see the audit vendors, the consulting firms that have tested these projects. So basically, the first thing that I did was say, okay, I need to concentrate all these reports in a single location, so they're easy to find and I can consult and query them in an easier way than just going through each project. What I did was create a repository on my GitHub called CNCF Security Audits; inside it, each folder has the project name and contains the PDF with the report and the results of each security audit. Okay, that's good, but how do we go through each PDF report and analyze all those vulnerabilities? As I said, the security audits are independently done by third party consulting companies. They're usually required for projects that move from sandbox to incubating, and they started in 2018, at least those are the first ones that we have details from. They're usually one time only, with some exceptions: if the project is very well known and very widely used by the community, such as Kubernetes, it usually goes through another security audit, like the one that Kubernetes is probably going through right now, or at least it has an RFP open at the time I'm recording this video. Yeah, so as I said, the result is a PDF report with the findings.
And in security, that's fine; that's usually the artifact you expect from a pen testing engagement, a PDF with all the findings. But for developers and DevOps organizations, I think we are way past that. We need an easy way to list out those vulnerabilities, check their status, and see whether the problem was fixed or not and who's addressing it. So, as a suggestion here, and I'll give more recommendations at the end, maybe having those finalized reports added as GitHub issues or something like that would make it easier for everyone and would give more visibility into those security audits. Okay, so how did I start this project? The methodology that I used was: download all the PDFs; as I said, I created this GitHub repository to help me with that, and it's available for everyone. I parsed all the PDF reports and collected all the relevant data that I needed, and basically aggregated everything into a single location. Of course, I used an Excel file for that because it was easier to see and compare the issues and aggregate results. In the beginning, I analyzed the data from all those reports, worked on understanding the results, and did some measurements: which project has the most vulnerabilities, what are the most critical vulnerabilities, and all that stuff. Then I decided, okay, maybe I could do more, and I also assessed the third party risk of those projects, analyzing their libraries and dependencies with an open source security tool called Snyk. One last note here: there were some security audits that were specifically about fuzzing, and so the vulnerabilities reported there were very specific to fuzzing, buffer overflows, and things like that.
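The parse-and-aggregate step I just described can be sketched in a few lines of Python. This is a minimal illustration, not the exact script I used: the finding-line format in the regex is a made-up example of how audit reports often label findings, so you would adapt the pattern per vendor.

```python
import re
from collections import Counter

# Hypothetical finding format, e.g. "TOB-K8S-001: Improper input validation (Severity: High)".
# Real reports differ per consulting firm; the pattern below is illustrative only.
FINDING_RE = re.compile(
    r"^(?P<id>[A-Z0-9-]+):\s*(?P<title>.+?)\s*\(Severity:\s*(?P<severity>\w+)\)$"
)

def parse_findings(report_text):
    """Collect (id, title, severity) tuples from text extracted out of a PDF report."""
    findings = []
    for line in report_text.splitlines():
        m = FINDING_RE.match(line.strip())
        if m:
            findings.append((m.group("id"), m.group("title"), m.group("severity")))
    return findings

sample = """\
TOB-K8S-001: Improper input validation (Severity: High)
TOB-K8S-002: Hardcoded credential paths (Severity: Medium)
"""
findings = parse_findings(sample)
severity_counts = Counter(sev for _, _, sev in findings)
print(severity_counts)
```

With all reports parsed this way, the per-project and per-severity counts can be aggregated into one table, which is essentially what the Excel sheet held.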
So for this version of the project, I decided to leave those vulnerabilities out; otherwise I think it would skew the data and the comparison, because not all projects have had fuzzing testing. Okay, the analysis. I'm going to show some graphics here in a second, basically comparing the projects by number of vulnerabilities, the vulnerability categories and types as reported by the consulting firms, and also comparing vulnerabilities by severity, so that we can see how many vulnerabilities we have as low, medium, high, and critical. Then I'll go into the open source security, or third party dependency, part, what we like to call software composition analysis. Okay, so moving on. This presentation has some of the main results and graphics. At the time of this recording, we're working on finalizing the PDF report, which we're probably going to publish together with this talk when it's released during CloudNativeSecurityCon. So you're probably going to receive a link to the complete PDF report with all the details that I found and wrote about. Okay. The first thing that we analyzed was which project had the most vulnerabilities reported against it, and it comes as no surprise that Kubernetes came out on top with the most vulnerabilities reported, being the largest of those projects and the most adopted of them.
And of course it has more lines of code, so there are more chances of vulnerabilities being found. There are some studies, some researchers, reporting that there is at least one vulnerability per 1,000 lines of code in any kind of software. Of course, this is an average, but at least we know that the more lines of code, the more chances you have of finding vulnerabilities, so that's a given. In second place there is etcd, which is also part of Kubernetes by default, and Helm, which is the package manager for Kubernetes projects, in third place. That's quite interesting, and you can really see the difference between Kubernetes and etcd in second place and the other projects as well, so that's something we need to be aware of. Most of these vulnerabilities were fixed or their risk was accepted, so of course, make sure that you're using the latest version available for your Kubernetes projects. Here is just a drill down of those numbers, how many vulnerabilities were found in each of them. As we can see, Kubernetes has more than double the vulnerabilities of second place etcd, and so on; I just showed a few of them here. Okay, before I go into the next graphic, we need to talk about the classes of vulnerabilities. One of the vendors, I think it was Trail of Bits, classifies the vulnerabilities in their reports according to these classes, such as access controls, auditing and logging, authentication, and so on, and they give you a description of what each class means. Some of the other consulting firms didn't adopt this kind of approach, so we found it interesting and we applied this classification to the other vulnerabilities as well: according to the description of each vulnerability, we assigned a specific class or category to it so that we could combine all the data and create some evaluations.
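The "one vulnerability per 1,000 lines of code" heuristic above is easy to turn into a back-of-the-envelope estimate. The line counts below are invented placeholders, not real measurements of any CNCF project; the point is only to show why larger codebases are expected to surface more findings.

```python
# Rough estimate under the "one vulnerability per 1,000 lines of code"
# average cited in the talk. This is a heuristic, not a measurement.
VULNS_PER_KLOC = 1 / 1000

def expected_vulnerabilities(lines_of_code):
    """Expected latent vulnerabilities under the 1-per-KLOC heuristic."""
    return lines_of_code * VULNS_PER_KLOC

# Hypothetical line counts, for illustration only.
hypothetical_loc = {"large-project": 2_000_000, "small-project": 150_000}
for name, loc in hypothetical_loc.items():
    print(name, round(expected_vulnerabilities(loc)))
```

Even if the constant is off by an order of magnitude, the relative expectation stays the same: the bigger project carries proportionally more latent risk.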
Basically, the vulnerability class or category with the most vulnerabilities is data validation, and as we can see here, it's way ahead of second place, which is configuration. Data validation relates to either input validation issues or lack of verification of data, and it can also cause things like buffer overflows and heap overflows. So it's really interesting to find that a lot of these projects have data validation issues, which is something that can often be solved by adding proper validation in your code, along with proper exception handling, try/catches, and so on. Okay, the next one here is severity. Looking at the vulnerabilities per severity, we found that most of the projects only had low and medium vulnerabilities, which is good; I think it's reassuring for those of us using these projects. Only very few of them had some critical vulnerabilities, down there at the bottom. I think it's interesting to note that despite the great quality of the consulting firms that tested these projects, even though they're well known in their field, they couldn't find a lot of critical or high vulnerabilities across these projects, so that's interesting and something to note; I think that's important to highlight. Okay, there are more reports and analysis that I've done, and I couldn't show everything here in a 25 to 30 minute talk, so in the interest of time, please take a look at the report for more details and more analysis. The last thing that we did was the open source security and dependency analysis. When talking about cloud native security, we need to be aware that most of these projects rely on libraries and dependencies, which are also open source software.
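The classification step I described, assigning a Trail of Bits-style class to findings from vendors that didn't use those classes, can be sketched as simple keyword matching over the finding descriptions. The keyword lists and fallback class below are illustrative assumptions, not the exact rules used for the report.

```python
# Minimal sketch of mapping finding descriptions onto vulnerability classes.
# Keyword lists are illustrative; the real mapping was done by reading
# each finding's description.
CLASS_KEYWORDS = {
    "Data Validation": ["input", "validation", "overflow", "sanitiz"],
    "Authentication": ["password", "credential", "token"],
    "Configuration": ["default", "config", "permission"],
}

def classify(description):
    """Return the first class whose keyword appears in the description."""
    text = description.lower()
    for vuln_class, keywords in CLASS_KEYWORDS.items():
        if any(kw in text for kw in keywords):
            return vuln_class
    return "Uncategorized"  # fallback bucket for findings that match nothing

print(classify("Missing input validation on API server flag"))
print(classify("Hardcoded credentials in test fixtures"))
```

With every finding tagged this way, counting per class is a one-liner with `collections.Counter`, which is what produced the data-validation-on-top chart.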
These libraries are usually incorporated during the development lifecycle and rarely get updated or checked against known vulnerabilities. So in this section, we analyzed the dependencies of these projects and looked for outdated and/or vulnerable libraries, as well as license risks, in the code they might be using. In this analysis, we used Trend Micro Cloud One Open Source Security, which is powered by Snyk. As we can see here from the results, I only tested the projects that had security audits done on them, not all the CNCF projects, so that we could make some comparisons. Again, Kubernetes shows up pretty much on top. If you look at the numbers, it is the project with the most vulnerabilities in its dependencies, aside from Notary, which is on top here because it also has a critical vulnerability. That comes as no surprise, given what we found in the previous analysis of the security audits. In second and third place for the total number of vulnerabilities in their dependencies, we have Harbor and Vitess. It's interesting to note that, although these projects have access to some open source tools, and I think the CNCF runs its own vulnerability scans using Snyk as well, there are a lot of issues here. I don't mean to say that all those vulnerabilities are valid and exploitable; there is some analysis that needs to be done. But there are probably some issues here that need to be addressed and fixed, and some of them could be fixed just by updating the library to the latest version. So some of this should be prioritized and fixed. Here we can see that the vulnerability risk is increasing over time with regard to the libraries and dependencies: as time goes by, new vulnerabilities are found and the libraries get outdated.
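One way to compare projects on their dependency-scan results, as in the ranking I just showed, is a severity-weighted score. The project names, counts, and weights below are invented placeholders, not the actual Snyk results or the report's methodology; they just illustrate why one critical finding can outrank many lows.

```python
# Hypothetical severity weights; real prioritization would also consider
# exploitability and whether the vulnerable code path is reachable.
SEVERITY_WEIGHTS = {"low": 1, "medium": 3, "high": 7, "critical": 10}

def risk_score(counts):
    """Weighted sum of vulnerability counts per severity."""
    return sum(SEVERITY_WEIGHTS[sev] * n for sev, n in counts.items())

# Made-up scan results for two placeholder projects.
scan_results = {
    "project-a": {"low": 12, "medium": 5, "high": 2, "critical": 0},
    "project-b": {"low": 3, "medium": 1, "high": 0, "critical": 1},
}
ranked = sorted(scan_results, key=lambda p: risk_score(scan_results[p]), reverse=True)
print(ranked)
```

Rerunning this scoring periodically on fresh scan data is also how you would see the risk-over-time trend mentioned above: the same dependency set accumulates newly disclosed vulnerabilities as it ages.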
As more issues are raised, the risk increases for these projects, so the quicker we address those issues, the faster we can have them fixed and reduce the risk of these projects being compromised. So, to summarize everything for this project and this report: we came up with some recommendations for the CNCF, and especially the Security TAG, which I'm also part of, so I'm going to recommend these to them as well. The first thing to improve the overall quality and security of those projects, I think, would be establishing more periodic assessments, at least once a year or something like that, if that's feasible and, of course, if there is budget for it. Another recommendation is to implement a CNCF bug bounty program. While doing this analysis, I found out that only one CNCF project has a public bug bounty program, which is Kubernetes; all the others only have what we call a VDP, a vulnerability disclosure policy. Maybe a bug bounty program would increase the number of vulnerabilities that get reported to these projects. Another thing we noticed when we were trying to run the RFP for the new Kubernetes security audit is that we had a hard time finding consulting companies and vendors that were interested in doing it. So really promoting those RFPs even more, making them available, and publicizing them to different organizations, maybe in different countries, would get more responses for those audits. Also, it's hard to see whether the issues reported in those audits have been fixed or whether they have any kind of CVE assigned to them; you would have to go through the code and analyze all the commits to find out whether anything relates to the vulnerabilities that were fixed.
So making these reports, and the status of each vulnerability, easier to find and more visible is something I think we can improve upon. Another thing is to prioritize issues related to critical third party components. We saw that there is at least one project with a critical vulnerability in its dependencies, and there are also a bunch of projects with high severity vulnerabilities. So maybe we should prioritize those and talk to the project maintainers to see if they could fix them, of course without causing any problems for the projects and applications themselves, because we know that sometimes updating libraries can be tricky and might break your applications, so we need to be careful there as well. Okay, so as I said, this is what I would call the first edition of this project, and I'm planning to do a second edition next year. As for the goals and next steps: I want to provide the data back to the community and share it with the CNCF Security TAG, basically all the data that I aggregated from the different security audits. As I said, I'll try to run this once every year and compare the results with the previous version. I'd also like to add static analysis and code reviews for all the projects, analyzing their code for security vulnerabilities, and check whether there are any issues that went unreported or unfixed, report those to the proper projects, and hopefully get them fixed. I think that concludes my presentation. I hope you enjoyed this talk, and I'll be available for questions either in the Q&A section or on Slack. Thank you.