Hello, and welcome to "Checklist for Aviation Vulnerability Disclosure: Don't Go It Alone." Before I dive into the talk, I'd first like to thank the Aerospace Village for allowing me to come speak today. They've been a great partner with CISA over the past year, and I really appreciate them letting me come chat with you for a few minutes. I am with the Cybersecurity and Infrastructure Security Agency, commonly referred to as CISA. You may know us from some of our previous branding, such as US-CERT, ICS-CERT, the NCCIC, or even NPPD. All of those functions, brands, and identities have been rolled up into a single agency within the US Department of Homeland Security. So what do we do? We essentially function as the nation's risk advisor: we do national risk management for cyber and physical infrastructure. How do we do this? We leverage four major program areas. First, federal network protection, where we augment the protection of the .gov web space. Second, infrastructure resilience and field operations; if you're in the US, you may have crossed paths at one time or another with our CSAs or PSAs, our cybersecurity advisors and protective security advisors. We also have emergency communications; this group supports the COMM-ISAC and that pretty cool PSF2 function. And most importantly, the final group that I skipped: comprehensive cyber protection efforts. This is where I fall among the many functions in this group; I help facilitate the disclosure of industrial control system vulnerabilities. To sum up this part with who I am: I am Jay Angus, the federal lead for the Industrial Control Systems Vulnerability Disclosure Program. Since 2012, we've been disclosing on average about 700 to 800 industrial control system vulnerabilities per year, and these vulnerabilities impact products across all 16 critical infrastructure sectors.
We try to provide timely, actionable information to remediate these vulnerabilities in as public a manner as possible. So let's level set here. What is a vulnerability? What do I consider a vulnerability? I pulled this from the Internet Engineering Task Force; according to their RFC 4949, it's a flaw or weakness in a system's design, implementation, or operation and management that could be exploited to violate the system's security policy. I generalize it as a risk that can be exploited by a threat actor, such as an attacker, to perform unauthorized actions within an information system. We break this down generally into two categories. From my perspective, first we have our zero days. These are unintentional vulnerabilities: risks previously unknown to anyone, which typically require intervention from the vendor for remediation. Then we have misconfigured devices. Whether intentional or accidental, this risk is commonly introduced by the asset owner, and it requires changes to the configuration of the device or the architecture surrounding the device. Either way, zero day or misconfigured device, this is dangerous information in the hands of an adversary. To continue the level setting, what do we consider a coordinated vulnerability disclosure? To read the slide here: it's an effort to verify that accurate information is being openly disclosed about a vulnerability so that it can be addressed accordingly. I sometimes use the airplane analogy: much the way a plane takes off, a vulnerability is found, and hopefully both will make it, controlled and safe, to their destination, whether that's a receiving airport or a public disclosure. My team works to make disclosures of vulnerabilities in a safe, controlled manner. There's no guarantee that every disclosure will be smooth.
But as the aviation sector begins to handle more disclosures, I think this is going to get much easier, and this will be an issue we won't be as concerned about as time and effort continue to grow in this area. So, to help frame this conversation today: the aviation sector loves checklists. It's a very common approach to managing safety and risk within the sector, and one of these checklists could make disclosures easier, whether you're a researcher or a vendor. I want to pull from a form published by the FAA, FAA Form 7233-4. If you're a pilot, you're probably very familiar with this: it's the pre-flight pilot checklist. It collects a lot of information prior to takeoff, and hopefully I won't get too lost in the jargon here, but I'm going to break some of this down, compare it to the vulnerability disclosure process, and see if we can find some parallels. The first thing I noticed is that this checklist is very interested in the weather. What does the weather have to do with vulnerability disclosure? To me, its primary purpose appears to be documenting what the pilot has no control over, which I found to be ironic. In vulnerability disclosures, multiple parties of significant influence come together, and they typically have no direct control over each other. They can certainly cause buffeting of each other during the course of a disclosure. So as the researcher starts down the aviation research path, what is their end goal? Typically, their findings are focused on a safer and more secure aviation sector. For the researcher, I cannot stress this enough: know the company that you're dealing with, going back to the idea that you can't change them. I have no doubt that you know how to manipulate their products and work with them very aggressively, but know the culture of the company.
Depending on the company, it could be difficult for you to get your report in front of the right person. Similarly, for vendors: you can't control researchers, but you can help guide them down a safe path so that you both have a controlled disclosure and sharing of information. For the vendor, the first step is going to be that vulnerability disclosure policy. And for vendors and researchers, as I said, you can't change the other party, but you can certainly create opportunities to facilitate a controlled disclosure. Continuing down the checklist: winds aloft. This is the part of the flight that should be the most straightforward, cruising. Systems are communicating as intended, and the parties have acknowledged each other's presence. The researcher has begun divulging their findings to the vendor, and the vendor is reviewing the information. But we won't be at this altitude forever. As the exchange of information progresses, researchers and vendors need to be open about their intentions and capabilities. Researchers need to be open about what they intend to do with their disclosure. Maybe they're going to take this to a talk at a conference. Maybe they're just doing this as an act of goodwill and don't require public acknowledgement such as CVEs or press releases. Whatever the case may be, the parties should be open about their tolerance for attention during the process. We want to keep the flight, the disclosure, moving along as smoothly as possible. To keep this portion of the disclosure moving smoothly, vendors need to be more open about the timelines of their product development life cycle. Offer up realistic timelines for remediation. Simply ignoring the researcher doesn't buy you more time, and at times it can lead the researcher to make a less informed decision about how they want to proceed in a disclosure. Sometimes we don't need to dive into a long, protracted timeline for a disclosure.
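One concrete way to make a vulnerability disclosure policy discoverable, not something the talk mentions, but a common complement to a published policy, is a `security.txt` file as defined in RFC 9116, served from the `/.well-known/` path of the vendor's site. The domain, mailbox, and policy URL below are placeholders:

```
# Served at https://example.com/.well-known/security.txt
# Contact and Expires are the two required fields in RFC 9116.
Contact: mailto:security@example.com
Expires: 2026-12-31T23:59:59Z
Policy: https://example.com/vulnerability-disclosure-policy
Preferred-Languages: en
```

A file like this gives researchers and coordinators a predictable place to find the right inbox, which is exactly the "get the report in front of the right person" problem described above.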
And I want to break that down a little into two different types of disclosures here. Disclosures for misconfigured devices frequently focus on leaving the information with the vendor, and a lot of times the asset owner, under the expectation that they'll remediate the finding. This type of disclosure is likely straightforward and relatively quick to resolve, and there's no space for a CVE assignment. That being said, if you are having trouble getting someone's attention, we can help assist with that and make sure the information is put in front of the right person; my contact info is at the end for that kind of disclosure. Second, the other type of disclosure we see is for zero days, and you can expect a long, protracted disclosure timeline for these. We've seen anywhere from 45 days for a typical IT vulnerability to, at times, a couple of years for operational technology. Try to find a happy medium here. Researchers should weigh the amount of time that they spent finding the vulnerability and expect lag in remediation from the vendor, particularly for aviation systems, as they have a much longer remediation process. A quote I hear sometimes from the aviation sector is that it costs a million dollars to update a line of code. Whether or not that's true, recognize the great effort that could be required to remediate a vulnerability. And here we are at what I think is one of my more important points for this discussion today: who are you working with? First off, I'll go ahead and list my favorite option, CISA. Our ICS vulnerability disclosure program has been in place for eight years now, and we've assisted with the disclosure of thousands of vulnerabilities across all critical infrastructure sectors. Most importantly, as a CNA, we can assign CVEs to new vulnerabilities, assuming they're within scope, and doing so assists with better identifying vulnerabilities in the future. And I've listed multiple entities here.
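The timeline figures above, roughly 45 days for IT and up to years for OT, can be made concrete with a small sketch. This is my illustration, not a CISA tool; the function name and the one-year OT placeholder are assumptions, since real OT timelines are negotiated case by case:

```python
from datetime import date, timedelta

# Illustrative starting windows, loosely based on the ranges in the talk:
# ~45 days is common for IT vulnerabilities, while operational technology
# (OT) fixes can take far longer. The OT value here is a placeholder.
DEFAULT_WINDOWS = {
    "it": timedelta(days=45),
    "ot": timedelta(days=365),
}

def planned_disclosure_date(reported: date, category: str) -> date:
    """Return a starting point for negotiating a public disclosure date."""
    return reported + DEFAULT_WINDOWS[category]

# Example: a report filed on 2024-01-01 against an IT product.
print(planned_disclosure_date(date(2024, 1, 1), "it"))  # 2024-02-15
```

Treat the dates this produces as an opening position for the "happy medium" negotiation, not a deadline either party can impose on the other.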
All of these entities are more than happy to work with researchers, and I've seen them do so in good faith. And for vendors: who are you working with? It's important that you have a relationship with the national CERTs, and you want to have that relationship ahead of time. If you have a product security team, they're in a perfect position to build those contacts and reach out to these national CERTs to establish relationships in advance. I can also talk here about when an additional party should be brought in to assist with a disclosure. I definitely believe that for all disclosures, any time one crosses international boundaries is a great time to engage an external third party to assist with communicating with other entities and make sure your information is put in front of the right group. And in the other situation, obviously for this discussion, aviation disclosures should always have a trusted third party involved. These third parties add some oversight and an additional set of eyes to make sure that the correct stakeholders and asset owners are being notified. So, hopefully, as discussed earlier, we've talked around the factors: what does a landing look like for an aviation disclosure? Researchers, from my experience, want to be acknowledged, publicly or privately, and want to see a fix for the vulnerability. At times, it is just that simple, especially for misconfigured devices. Many times vendors struggle with their own bureaucracy and lose valuable time arguing points of a disclosure rather than focusing on remediation. My opinion is that we do not see enough aviation-specific vulnerabilities with CVEs being assigned. While I can appreciate the sensitivity of the sector, this quiet disclosure process can create digital debt that will come due over and over again as more researchers enter the sector and continue to have similar findings.
While the quiet disclosure is a useful alternative, it will come with consequences over time. So, airspace restrictions: make sure you're clear to do the work that you're doing. This should go without saying. Please, please do not do cybersecurity research against a system that is not yours or that you do not have clear, written permission to access. If you have a question about the legality of your work here in the US, I encourage you to check out the DMCA. There is space for cybersecurity research specific to aviation devices; it just needs to be in your possession and be your property. To be clear, while I do work for DHS, I'm not law enforcement or a sector regulator; I can't put anyone in jail, and I certainly can't force anyone to fix a vulnerability. That said, as a federal employee, I do have a certain obligation to report suspected crimes. Remember, I can definitely help researchers amplify their messaging around a vulnerability, and I have a team of really awesome vulnerability assessors that can assist vendors and researchers in describing the vulnerability and suggesting mitigation strategies. But I have to have that information put in front of me, and it has to be in a legitimate, legal format. So, as I try to sum this up: don't go it alone. I think that's one of the most important parts here. Be prepared beyond the vulnerability. So many times I've seen researchers who know systems better than the manufacturer, but who are completely unprepared to acknowledge that they're working with a business that operates nine to five. Vendors, I can't help but push this message over and over again: you've got to have a public vulnerability disclosure policy. It makes it easier for researchers and vulnerability coordinators to put information in front of you. And if you have a public policy, make sure you have an internal policy and the staffing to support it.
You need a team to manage these types of risks. It's great that you want to take responsibility, but you must have accountability for these processes. And for both parties: be honest about timelines. I know it's hard, and you can't take this process personally. One item that I'd like to see more of: I think tabletop exercises specific to your organization's business model are important. They can help identify responsible personnel and develop policy in advance of a disclosure. A great way to create a more realistic scenario is to bring in a researcher that you've worked with in the past. They probably have experience with disclosures that went in difficult directions, and bringing them in to share that expertise would be a huge opportunity for your organization to grow. So researchers and vendors, while you're working together, you need to be honest, again, about these timelines and plans for the vulnerability. I really don't enjoy seeing researchers dropping zero days on the internet, especially in the aviation sector, without a reasonable attempt to contact the company. That just doesn't build trust between the two parties in the long run, and I don't believe it serves the greater good. The same goes for vendors ignoring researchers: the lack of engagement doesn't build trust and at times causes these uncontrolled disclosures to happen. Finally, have a plan tailored to your goals, vendor or researcher. Both parties in a disclosure should have a plan in place for it. And as always, CISA is here to help. We have a great deal of experience with this. We partner regularly with the CERT/CC, we have direct access to the FAA, we've partnered with the Aviation ISAC, and we also have our ACI program to support disclosures. So if you have questions about disclosures, here's my contact information.
Feel free to send me an email and I will be in the chat box here after the presentation answering some questions. If you have anything, I'd love to support your disclosures. Thank you.