Hello and welcome to my talk here at the Aerospace Village at DEF CON Safe Mode: MITM, the mystery in the middle. My name is Matthew Gaffney and I'll be taking you through an introductory-level talk about the aircraft information systems domain, also known as the AISD. We'll be covering these topics throughout the 15 minutes or so of today's talk, spending most of it on EFBs, if I'm honest. So why the mystery? The reason I called it the mystery in the middle was the lack of public information I could find about the AISD when I started my journey into aviation cyber security several years ago. I could find lots about avionics systems, IFE and all about their weaknesses, but very little about the AISD. So the main reason for the talk today is to try and improve knowledge and awareness of this part of the aircraft and generate healthy discussion about security within it. Logically, the AISD sits in between the aircraft control domain and the passenger information and entertainment systems domain, known as the ACD and the PISD. It's essentially what makes an aircraft what's known as e-enabled. E-enabled means that the aircraft will integrate with the airline's IT systems to provide significant cost savings to the business. It allows remote maintenance and uploading of software updates, which is a key time saver, especially when you consider that previously, if you wanted to update the firmware on a module, you'd have to remove it, send it away, put in a replacement in the meantime, then get the old unit back with the new firmware and swap it over. So being able to upload new software remotely, without having to remove modules, or LRUs as they're known, is a considerable cost and time saving for an airline. It also allows the airline to download logs, both operational and security. 
For airlines, the AISD is the most important part to get right for the business, at least that's my theory. An airline cannot have much influence on how the ACD is constructed. There may be a small amount of customization, but not very much at all. It's heavily regulated and heavily dependent on certifications, which are long, expensive and time consuming. So the AISD is where the airline, the operator, really has to get it right. But there are lots of challenges in doing that. Across all of aviation, not just e-enabled aircraft, there's a lot of older technology that gets reused. And that's not really a surprise, because if it works and it meets the requirements, why change it? Certifying new modules is expensive and time consuming as well, so older modules that are still fit for purpose get reused a lot, because it's cheaper to modify them than to certify something new. What this also means, though, is that as threats change over time, these older modules can be attacked in ways nobody thought about when they were conceived. We're seeing this with avionics hardware today, but we are not really thinking about it when it comes to the AISD, and maybe we should. When I say we have new risks to manage, what I mean is that engineering now needs to consider cyber and infosec risks, and infosec teams now need to understand and consider engineering. Traditionally, these are two areas of the business that have been completely isolated. That has allowed engineering to operate in its own little bubble, which has led to engineering not following infosec policies 100%. You see examples of bad practice such as post-it notes with passwords, or even passwords written into procedure documentation, because it's easier to operate that way when you're an engineer than it is to use a password manager or to remember the password off by heart. 
At the same time, IT and infosec are implementing systems for engineers to use without fully understanding the constraints of engineering. Engineers don't always have a really good 3G connection on board an aircraft. They don't always have time to learn new systems in depth. What they do have is very pressured time constraints to get aircraft back on the line, because all the time an aircraft spends being fixed is money. But when IT comes along and implements a new system without, say, single sign-on, all of a sudden that engineer, instead of 20 passwords to remember, now has 21. And then we ask the question: why are they writing down passwords? So understanding the constraints and the difficulty of engineering can go a long way to improving the security posture of an airline, and operators really need to get these departments working together in harmony. When dealing with engineering and e-enabled operations, you cannot bring the corporate information security mentality and templates to the party. They just won't work. You're dealing with an area of the business which historically has not been great with cyber, so you have to go slow. One thing that's on your side though, as an infosec person, is that aircraft engineers love processes and procedures. Oh boy, do they love them. Before you can even touch an aircraft on the ground, you must have a job card which outlines the steps engineers must take for each task. So if you can bring influence to these procedures to improve security, that's half the battle won. And finally, I should probably explain that last comment on the slide. This is my humble opinion and I know many would disagree, but I feel that when looking at aviation security, hackers should not focus entirely on avionics. Although the ACD and the avionics have a number of well documented technical weaknesses, they have a very effective physical security model. 
You can only access the avionics bay from outside the aircraft. So when the aircraft is in flight, not even the pilots can get to it; only the engineers on the ground can. And if you've ever tried to get access for legitimate reasons, you know how difficult it would be for illegitimate ones. Also, there's usually only one network interface between the ACD and the AISD, and I'm talking about aircraft that are fully compliant with ARINC 821 here. This separation is normally implemented with protections such as a firewall or even a data diode. Sometimes that network communication is also based on proprietary protocols, not on TCP/IP. So if you can get into the ACD, yes, you can inflict a lot of damage, but gaining that access, either through physical or technological means, would be extremely difficult. And if you could get to that level, if you could get physical access to an aircraft, why go hacking? You could do something much more nefarious with a much greater guarantee of success, if that's what you wanted to do. For me, instead, the aircraft information systems domain has multiple touchpoints ranging from the airline network and Wi-Fi to the internet and the cabin. And these attack scenarios are much more traditional and realistic for a hacker. Okay, it's not sexy. We can't control the aircraft's control surfaces from the AISD. But you can have other effects on the aircraft, effects which may impact operations. And if you know what you are doing, you could modify some of the data presented to pilots. So, manufacturer challenges. I explained the challenge of technical debt. The reason for it is that airworthiness timescales are huge, and they almost force manufacturers to really think about what they're going to reuse and what they have to rebuild from scratch, because those brand new modules are really expensive to get certified. 
Challenges for the operator are very different and in many ways quite significant. The key one is that they have zero visibility of the manufacturer's risk assessment. When the manufacturer is designing and building an aircraft, they perform extensive and detailed risk assessments to mitigate system vulnerabilities. They will mitigate many of these issues themselves, but they will also transfer some to the operators. This is communicated in a number of ways, such as security handbooks or similar guidance, depending on which manufacturer you're dealing with. But no operator gets to see details of the original vulnerability or the risk assessment. So despite the quality of this manufacturer guidance and the recommendations which come down, there's no real mechanism for the operator to assess the effectiveness of implemented controls. And this can, and from what I've seen probably does, lead to gaps in the cybersecurity protections of e-enabled aircraft at the operator level. When you think about it, airlines are heavily reliant on legacy IT systems. For example, many use at the core of their business a TPF system, which handles all of the real-time transactions. It's not a traditional operating system; it is based on technology that was released in 1979, and programming it is done in hexadecimal. It's just not what people in my generation would consider a traditional operating system. But it handles transaction messages one to one at extremely high volume and extremely high speed. What that does is make sure that the same seat isn't booked twice by separate transactions milliseconds apart. There's nothing out there on the market that can really replicate that right now. Also, airlines over time have grown and have added new systems on top of the old systems. And until quite recently, this may have been done without consideration of all the risks. In fact, some of it would have been done before cybersecurity was really a thing. 
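To illustrate the property I just described, here's a toy Python sketch. It is nothing like the actual TPF implementation, and every name in it is invented; it just shows the idea of serializing seat assignment so that two transactions milliseconds apart can never both claim the same seat.

```python
import threading

class SeatInventory:
    """Toy sketch only -- a real airline core (e.g. TPF) does this at
    vastly higher volume. A lock makes check-and-claim one atomic step,
    so two transactions milliseconds apart cannot book the same seat."""

    def __init__(self, seats):
        self._lock = threading.Lock()
        self._free = set(seats)

    def book(self, seat):
        with self._lock:          # serialize competing transactions
            if seat in self._free:
                self._free.remove(seat)
                return True       # this transaction won the seat
            return False          # seat already taken (or unknown)
```

The first caller to `book("12A")` succeeds; every later caller is refused, no matter how close the race.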
So some may not have followed good practice such as network segregation. Airlines also operate 24/7, and that means outages are very costly. When an airline IT system goes down, aircraft stop; they just cannot take off because they haven't got the information they need for the pre-flight. So any introduction of a new system that could have an impact on operations is risk managed to the nth degree, and there's a lot of risk avoidance going on when these new systems are put in place. Either that or you just throw resources at it to make sure that if something does go wrong when you introduce a new system, the impacts are mitigated as much as possible. What it does mean is that over time, airline networks can become quite flat. And what that means is that if the perimeter is breached, it's very easy for an attacker to then pivot inside the networks. So, where do regulators fit in? Speaking from a European perspective, there are some really positive steps going on right now in the critical national infrastructure sphere, of which aviation forms a part. Maturity in aviation cybersecurity is growing slowly and gathering momentum, or at least it was before COVID, the last time I was really involved. From what I've seen, the UK CAA is growing its cyber oversight function in a very positive way. Rather than just setting out a framework and directing airlines to comply, they are working closely with operators instead, guiding them and helping them through their ASSURE process, which is really good to see. So, going back to this gap in the risk assessment: could the regulator help bridge that gap? The issue with sharing the risk assessments with operators is that they are highly sensitive. Maybe some trusted government bodies could be given more details on the risks and help guide operators on the effectiveness of their controls without passing on that sensitive detail. And if we can't trust the government, what about the Aviation ISAC? 
Although it is a member organization, most of the large operators and manufacturers are members. Could this be another suitable mechanism to do the same thing? Whatever happens, I feel something does need to be done. Otherwise, pardon the pun, airlines could be flying blind. So, here is a rather simplified overview of an e-enabled aircraft. This diagram is based on compliance with the ARINC 821 standard. As you can see, we've got the three main domains: the aircraft control domain, the aircraft information systems domain, and the passenger information and entertainment systems domain. The AISD, as well as hosting a main communications hub, will host the electronic flight bag if it's a Class 2. If it's a Class 3, the EFB will sit in the avionics; more on that later. Some aircraft have their flight attendant panel, or FAP, connected in the AISD. There are also occasionally printers and maintenance terminals, which can be USB or Ethernet ports, depending on what kind of device you're using: a straight-up USB stick as a tool, or a portable data loader. Those maintenance ports are used as a backup for the IPsec tunnels that sit between the aircraft and the ground LANs. These IPsec tunnels are really well secured; the security is excellent. They are managed mostly from on board, but the ground systems are involved in the keying of the aircraft, and we'll talk a bit more about PKI later on. If the EFBs are connected to the AISD, they will often route traffic to the internet through the IPsec tunnel, through the DMZ and out into the internet, usually for software updates, and usually to a very restricted set of websites. That is usually done on the ground as well, because if you imagine this happening during the flight phase, you have the latency of the SATCOM connection down to the ground systems, through the DMZ and out to the internet. 
The response time on that in flight would be pretty long, but on the ground it's manageable. One thing to note as well is that in e-enabled aircraft, instead of having a SATCOM connection in the AISD, there is sometimes an option whereby the manufacturer offers the operator the ability to use the passenger SATCOM connection. Remember, the passenger domain's SATCOM connection does not go through the IPsec tunnel; it goes off through the service provider's network. So one option that some manufacturers are offering to operators is that the information systems domain can connect to the ground systems through that same SATCOM link. That is quite advantageous in terms of cost: the more you use SATCOM, the cheaper it gets. If you have two connections, that's going to cost you quite a bit of money. If you have one, yes, you're going to be paying for more data, but because prices come down with volume, you'll actually be paying less overall. What I have seen, though, is quite promising: when this is implemented, the connection from the AISD into the SATCOM is through a separate VLAN. Essentially, that's a mini network segregation on its own, and that data is not accessible from the IFE or the cabin Wi-Fi. So, moving on to my favorite subject: electronic flight bags. These have been a revolution for pilots. As the name suggests, they replaced the large, cumbersome and often heavy bags which were used by pilots to carry paper-based manuals. These bags would often weigh more than 25 kilos, and as well as being difficult to maneuver, they also contributed to physical injuries. Moving those bags around the cabin and the cockpit sometimes led to shoulder injuries because of their weight. Depending on the type of EFB, they can perform a variety of functions, including acting as a documentation reference library. Type B can perform what's known as performance calculations. 
These give V1, VR and V2 speeds, all that kind of stuff, which are important for pilots to know. There's en-route navigation if you have a GNSS feed into the EFB, and if you have the relevant connections, access to NOTAMs, weather information, and the ability to do fuel calculations as well. Some operators also include bespoke applications that sit on the EFB. These might be used to complement airline operations systems, so long as they receive the relevant approvals. But they all help achieve the overall goal of EFBs, which is not just a reduction of paper inside the cockpit, but also faster access to information, which helps lower turnaround times and reduce workload in the cockpit. Non-pilots may not understand the importance of that last part: many incidents of pilot error cite cockpit workload as either the main or a contributing factor. So this makes EFBs not just an effective tool in reducing costs; they also help improve flight safety. Hardware categories have different names, depending on which regulator you sit under and how long you've been in the business. They're often used together, so I've included both here with some general characteristics. Class 1, or portable EFBs, are often small devices that double up as corporate devices at the same time. Think of a small iPad with no connectivity to the aircraft. They're considered a PED, a personal electronic device, and as such they must be stowed for takeoff, taxi and landing. Class 2 EFBs can be pilot-attached or plane-attached. These connect to the AISD, usually with a one-way feed of data from the avionics. They can be integrated with the cockpit displays, and their build is the responsibility of the operator. Quite often they are small laptops with operating systems such as Windows, and the manufacturer will provide the software which runs on top of that and connects to systems on board the AISD. 
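As a purely illustrative sketch of the kind of performance calculation mentioned above: at its simplest, a performance app interpolates speeds from weight and condition tables. The table values below are invented and nothing like certified performance data, which involves many more variables (runway, temperature, wind, flap setting and so on).

```python
# Invented numbers for illustration only -- real performance apps use
# certified aircraft performance data, not a four-entry table.
V1_TABLE = [(50.0, 125.0), (60.0, 135.0), (70.0, 144.0), (80.0, 152.0)]
# (takeoff weight in tonnes, V1 in knots)

def v1_for_weight(weight_t):
    """Linearly interpolate a V1 speed from the lookup table,
    clamping at the table edges."""
    pts = sorted(V1_TABLE)
    if weight_t <= pts[0][0]:
        return pts[0][1]
    if weight_t >= pts[-1][0]:
        return pts[-1][1]
    for (w0, v0), (w1, v1) in zip(pts, pts[1:]):
        if w0 <= weight_t <= w1:
            frac = (weight_t - w0) / (w1 - w0)
            return v0 + frac * (v1 - v0)
```

The point is not the arithmetic but that this data, and the software around it, feeds decisions pilots act on, which is why EFB integrity matters.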
It's worth mentioning that Class 2 EFBs have very different threat profiles depending on whether they are plane- or pilot-attached. Plane-attached EFBs are considered part of the minimum equipment list, or MEL, and are tracked as an aircraft part. They're always under the control of the airline, with coordinated and tracked movements between locations. Pilot-attached EFBs are not. They enter and exit the cockpit when the pilot enters or leaves the aircraft. They are very likely to connect to all sorts of networks in hotels, cafes and airports so that the pilot can update software, navigation databases and NOTAMs, read corporate email or even surf the internet. So you can see how the plane-attached EFB has a very different threat matrix to that of a pilot-attached one, which may be anywhere in the world using insecure Wi-Fi. And then finally, I've included Class 3 for completeness, because Class 3 EFBs don't really form part of the AISD, but they are an EFB. They are integrated into the aircraft control domain and have no connectivity outside of that. The build is the responsibility of the manufacturer, but it is maintained by the operator. What I mean by that is that the hardware is fixed; it cannot be changed. There may be software updates from time to time, but these are provided to the operator, who then has the responsibility of maintaining the right version. Class 3 EFBs are certified as part of the aircraft, so the operator has next to no ability to make changes unless they wish to go through the trouble and expense of re-certifying. And even then, the Class 3 EFBs I've seen are logically separated from the core avionics, and you have to interface through many different modules before you can bridge the gap between the EFB and the control surfaces. So attacking a Class 3 EFB to affect an aircraft in flight is still not that straightforward. Software categories. 
Type A applications are really simple; they're just used for the storage and display of documents. Like I said, these typically run on Class 1 devices, PEDs, and they get put away during the critical phases of flight. Type B applications are where it starts getting more fun. Like I said, they can perform performance calculations, they may access the internet on the ground and in flight, and they may be able to receive one-way data from the avionics. They are also the most common EFB software type I see. It's worth mentioning that Class 1 and Class 2 EFBs are built by the operator, and as such they are not the responsibility of the manufacturer. Even though the manufacturer may provide some of the software that runs on the device, it is up to the operator to ensure that it is working securely. For EFBs running Type B software, this is where things can get a bit blurred. According to regulations, any vulnerability in software that could affect the operation of the EFB, regardless of who coded it, is the responsibility of the operator to mitigate. At the same time, details of EFB vulnerabilities do not seem to be passed on to the operator, just a series of recommendations for EFB security. Now, these recommendations are really good, but as we'll come on to later, I don't feel they go far enough. Type C applications can affect the control of the aircraft in flight. So this is the really juicy part: essentially, they can send data to the avionics. However, at the moment of going to press with this presentation, there are no current approvals for Type C applications that I am aware of. I have seen future offerings from some manufacturers about having a bidirectional connection between the EFB and the flight management system, which sits in the avionics. However, I don't have enough detail to tell you much about it. As I touched on in the previous slide, regulators provide operators with cybersecurity recommendations. 
Recommendations are also given by the manufacturers, but I feel that even when you combine these two sources, this advice does not go far enough. By that I mean there is a lack of detail in what to consider and what measures to take. When it comes to regulatory documentation, security appears to be the poor cousin, with the majority of the document dedicated to physical characteristics and user interface design. The wording is also quite woolly and lacking in detail. It is certainly an area where I think there is a lot of room for improvement. Also, because implementing EFB security is the responsibility of the operator, some manufacturers appear to use this as a green light for poor implementations of technology, which on their own would be considered zero security by design. Instead, the security is assured by the inherent assumption that the operator's networks will never be compromised, or that malware will never find its way onto an EFB, kind of like passing the security hot potato. At the same time, operators are not directly informed of any vulnerabilities, instead being given a set of recommendations to implement. Unless the operator discovers any weaknesses themselves, they would be unaware of the risks involved, and if they were aware of the risks, I'm more than certain that they would have targeted controls at them based on what they knew. Overall, though, the recommendations given by some manufacturers are pretty good, although I feel there could be more detail in certain areas. I would like to share the document names and references with you. However, I am heavily constrained by NDA, as well as manufacturer efforts to protect IP, so I'm not allowed to give you these names, even though doing so would be for the benefit of any operators watching. 
And I cannot move on without briefly talking about Class 3, or integrated, EFBs. Even though they don't form part of the AISD, there are some use cases here which highlight a really important issue when it comes to technical debt. Like I said before, the hardware is fixed and the software updates come from the manufacturer. So, to an operator, it may seem a very attractive prospect in terms of time and cost for getting an EFB capability. However, this category of EFB is not without its issues. An example is an aircraft which I love, the Dreamliner, having flown on it many, many times, albeit only starting about three years ago. I absolutely adore this aircraft; it was great to fly on as a passenger. But many Dreamliners, especially the ones first brought to market, have the Windows XP operating system integrated in the EFB, which sits in the avionics. This is a fact which gets a look of surprise from infosec and IT people, but hang on. Remember, the 787 was designed in 2002-2003. Back then, XP was king and would be for many years to come. The type approval and first flight for this aircraft were in 2009. Vista was available, but come on, Vista, who would have changed? So when that aircraft first went to market, it was delivered with the EP4 EFB, developed using Windows XP. EP5 is available now, which uses Windows 10. However, not only do the upgrades come with a huge cost in terms of the unit, but also a huge cost in terms of downtime and recertification of the avionics, because it's a hardware change as well as a software change. So this kind of explains the issue of technical debt using more modern examples. Of course, we all think that an EFB on board a brand new, modern, state-of-the-art aircraft should be running the most recent stable version of an operating system. 
But even in the timelines that we're talking about here, which are very small, we're seeing out-of-date operating systems being used in critical systems because of those timelines. And this issue is not limited to one manufacturer. I just brought up Boeing because of the knowledge I have of the EP4 and EP5 versions and their operating systems. It's not even limited to EFBs; it's all across the aircraft. I just wanted to discuss the technology trap for those who may not fully understand why we see old tech in these brand new, modern, state-of-the-art aircraft. But it is interesting to think that the first Boeing 787s still have 20 years of life left in them, or more. And when they come to end of life, will they still be running EP4 with XP? I think some will. And if Ken from Pen Test Partners is still climbing around aircraft in breakers' yards in 2040, then he may come across Windows XP inside the avionics, which will be a really, really cool thing to see him play with. If you haven't seen their videos, they have some great walk-throughs on 747s going through the scrapyard today, and they also have older videos on their blog about dismantling old IFE systems, which had 386 hardware inside them. And these are aircraft that were only dismantled a few years ago. So, remember I said that aircraft are flying data centers with half of it on the ground? Well, this is your half on the ground: ground systems. They're used for keying and rekeying the PKI on the aircraft. They're used, like I said, for the uploading of LSAPs and FLS to upgrade software modules on board the aircraft, a major, major time saver. But they're also used for configuration changes, downloading of logs, and as a conduit for EFB connections. And this is done with software provided by the manufacturer in a client-server relationship. 
The idea is that the operator implements a Windows server which runs the server software, and the client is installed on the engineers' laptops. The PKI on board the e-enabled aircraft is typically used to protect the VPN connection between the aircraft and the ground system. It is also used to digitally sign the software that gets uploaded, so integrity checks can be done before it is installed on the aircraft. And this is one area that manufacturers seem to have gotten right with the latest e-enabled aircraft, at least the two main manufacturers anyway. While I do not have a PhD in cryptography, or anything higher than a BSc in any subject for that matter, the PKI implementations I have seen are simply excellent. It shows the manufacturers are learning from previous issues, and that's great to see. As well as EFBs, I think that ground systems require special consideration when it comes to security control implementation. The e-enabled aircraft being sold on the market today are truly excellent. Whether you're a pilot or a passenger, they are fantastic creations and a true credit to the teams involved in their development and manufacture. However, the same cannot be said of some of the supporting software. Some of the software distributed with modern aircraft is quite old, and it was likely coded at a time when there was very little consideration given to secure DevOps, or even cybersecurity at all. It seems the approach back then was: so long as it works, it's fine. And that has led to some very insecure software being used with modern-day aircraft. However, I'm seeing positive movement in this area. There are new versions of EFB and ground systems software being developed today, which should be available in a few years' time, and they are going through what appears to be a sensible and secure approach to software development. Time will tell what comes out of that, but again, it's another positive move. 
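The sign-then-verify flow for software loads can be sketched like this. To be clear, real e-enabled aircraft use asymmetric PKI with X.509 certificates, so the aircraft only ever holds public keys; the HMAC below is just a standard-library stand-in to make the check-before-install idea runnable, and the key name is invented.

```python
import hashlib
import hmac

# Stand-in for the PKI: a real implementation uses asymmetric
# signatures, so the signing key never leaves the ground side.
SIGNING_KEY = b"demo-only-key"  # hypothetical

def sign_software_load(load):
    """Ground side: sign the software load before sending it up."""
    return hmac.new(SIGNING_KEY, load, hashlib.sha256).digest()

def verify_before_install(load, signature):
    """Aircraft side: refuse to install anything whose signature does
    not match, e.g. a load tampered with in transit."""
    expected = hmac.new(SIGNING_KEY, load, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)
```

The design point is the ordering: the integrity check happens before installation, so a tampered LSAP never reaches the module at all.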
As with EFB software, how many operators have looked at the ground system software from an attacker's perspective? Have they looked for weaknesses? Have they found any, combined them with their threat model, and overlaid that on the existing network architecture? Remember, like I said, a lot of airline networks have hardly any segregation, if any at all. So if the corporate network was compromised, could someone pivot into the ground systems and put your aircraft at risk? Conversely, could this software, and any weaknesses you identify in it, introduce new attack vectors or vulnerabilities into your systems which could then be used to pivot into the corporate network? These are things that operators really need to consider and look at. Now, I have seen some manufacturers and third parties offer a ground system service hosted in the cloud. While I understand moving to the cloud is not the silver bullet for any organization, although some seem to think it is, from my experience these cloud solutions are normally put together very well from a security perspective. They also permit creative design methods to mitigate weaknesses in the software. So, for example, if you had issues with the client software that would sit on the engineer's laptop and you didn't want that interaction with the server going across the network, then what you could look at is putting both the server and the client in separate locked-down enclaves inside a cloud solution, then implementing a VDI to host the client interface through a secure HTML5 web front end. That gives you lots of opportunities to customize the security according to your threats and risks, and all sorts of other opportunities as well. These are certainly ideas worth looking at, because the software that's available today doesn't seem to be that good, but some of the cloud offerings are. So, on to flight attendant panels. These are interesting from a security perspective. 
Anyone that deals with industrial control systems will know that these are essentially what's known as an HMI, a human machine interface. They are used for lighting, air control, announcements, door status, emergency functions, and they might also have bespoke features that the airline implements. They can also be used for maintenance and for uploading FLS and LSAPs, and for downloading logs if the maintenance port isn't available. So, why do FAPs deserve a security focus? Well, like any HMI, especially those in ICS implementations, they're typically an easy target, especially those with a USB port. I did some research on FAPs, and while some are directly connected to both the AISD and the PISD in e-enabled aircraft, there are none which have a connection to the ACD or avionics. Not that I could see, anyway. That said, I did raise concern about the possibility of a passenger inserting a USB stick loaded with malware into the FAP USB port, and what that would mean. So the team I was in started asking the manufacturer some very probing questions. And to their credit, the manufacturer provided a highly detailed and impressive response, which was actually delivered in person due to the sensitivity of the content. What they showed us was that the technical controls to protect the ACD from the FAP were very robust. However, when we asked further questions, it seemed that the AISD was considered almost expendable in the malware-on-board scenario. The response was that the AISD can be entirely non-functioning and it will not affect the airworthiness, which can be interpreted as: the AISD can be riddled with malware and the aircraft can still be operated legally. However, that got me thinking about the operational side, and I wondered how many pilots, knowing that an aircraft is infected with malware yet still legal to fly, would actually fly it. 
What if, as an attacker, my goal was not to take control of the aircraft but to affect the operations of an airline, as a coordinated extortion or blackmail attack? Now, everyone in the industry knows, regardless of the airline, that this USB port is off limits, even to crew. I have spoken to cabin crew from multiple airlines and they have confirmed that they are told this in training. However, everyone in the industry also knows that this port is used extensively to charge crew devices, both official and personal, in spite of the rules. I have also seen passenger devices being charged in the FAP because the USB port in the seat wasn't working, and it was actually the crew who had done this as a courtesy for the passenger. So what is clear from this is that while people know there is a risk, the awareness training clearly isn't working, not all the time anyway. I thought it important to also include some detail on some of the more general issues I found, and I must state again, I am heavily bound by NDA, but I have reached out for advice and agreed the parameters of what I can divulge here prior to this talk. So, one team I was in had been working on the security design of their EFBs, and we had planned project milestones to ensure that the build documentation was accurate and that the device was built according to design. A lot of work had gone into locking the operating system down. It was Windows based, and this is where I had a lot of extensive experience and knowledge, so this is where I focused my attention. I knew that we had a pentester who was going to look at it from the perspective of an attacker and oversee my work, so I left the software alone. This is the manufacturer-provided software. I had naively assumed that this software would be fine, or at least not too bad. When it came to the pentest I knew the build was solid, so I was surprised when, on day two of the five-day test, Andrew the pentester called. 
Now, anyone that deals with pentests knows that if they call you prior to the planned end date and it isn't a question, then it's bad news. Andrew had not only found an API that was completely unprotected, with no authentication and no encryption, but had also written some code to exploit it. It was quite short, so I sent a message to the manufacturer straight away, and we received the response that this was not a vulnerability but in fact a feature. All I can say is that when I repeated this response to my colleagues, jaws across the room dropped to the floor. I took a step back and realised that in order to make the manufacturer understand, I had to do more analysis. Now, back when I was in the British Army I was taught to analyse attack scenarios by asking 'so what?' until you reach the answer. So I took the same approach with Andrew's code and turned the dial to 11. I explored every single parameter I could. At the end, it would not have taken much to turn that resulting code into very feasible malware for EFBs. I spoke with flight operations, pilots and infosec and demonstrated the rather trivial attack. Everyone I spoke to agreed that this was bad. So despite this feedback, which included a dossier of potential outcomes I had compiled from pilot and operations feedback, the only action taken by the manufacturer was to reduce the attack surface. They replied that in their opinion the scenarios I had described were not feasible, and the whole vulnerability was dismissed. Now, it wasn't like I was proposing aliens would come down and attack planet Earth, although 2020 isn't over yet. One threat which they dismissed as unrealistic was the insider threat. I found that really strange, because this threat was on the operator's risk register and I had been asked specifically to consider it when giving my updates to high-level security boards. 
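To illustrate the class of weakness Andrew found, and emphatically not the actual product's API, here is a deliberately generic sketch with invented handler, endpoint and parameter names. It shows why a request handler with no authentication check is an open door to any caller, and what the minimal fix looks like.

```python
# Generic illustration of an unauthenticated API endpoint -- the pattern
# described in the talk, with all names invented for this sketch.

def unprotected_handler(request: dict) -> str:
    # No credential, token, or transport check at all: every caller,
    # including an attacker on the same network, is trusted implicitly.
    return f"executed {request['action']} on {request['target']}"

def protected_handler(request: dict) -> str:
    # The minimal fix: refuse any request without a valid credential.
    # (A real system would use proper token validation plus TLS.)
    if request.get("token") != "expected-secret":  # placeholder check
        raise PermissionError("unauthenticated request rejected")
    return f"executed {request['action']} on {request['target']}"

# An attacker's request succeeds unchanged against the unprotected endpoint:
attack = {"action": "export_logs", "target": "efb-01"}
print(unprotected_handler(attack))
```

The "so what?" escalation described above amounts to enumerating every `action` such an endpoint will accept; once no credential is required, the attack surface is simply the full parameter space of the API.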
Some time later, however, I was informed that the software in question was going through a complete lifecycle upgrade and that the project would include not just a secure DevOps approach but security milestones with qualifiers as well. Now, I cannot take full credit for that change, because I know that somebody had moved into a new role covering the security of such software, which means they already knew there were issues and were moving to close the gap when I spoke to them about it. I know this person from my dealings with the manufacturer and I'm very confident he will do good work. Further investigations I've made on other aircraft have uncovered examples of the same unprotected API problem, a different API but unprotected all the same, and these are inside the AISD, so they are accessible from the EFB and from other systems on board. And despite going through the responsible disclosure route as an independent researcher, I was not only dismissed for making unrealistic assumptions but I had to fight for updates. Now, the assumption I made was based on a proposed commercial offering from the manufacturer, coupled with their own API documentation detailing how it would work, yet it was still considered unrealistic. They may have made a separate assessment themselves and started down a different, more secure pathway for the release of this new functionality, which would be great, but I was not made privy to the details, so I have to say that for now I don't know. If I do find out more, I will certainly update you. So hopefully that has given you a sufficient introduction to a network domain that is often overlooked by security researchers but which has a lot of juicy issues that need to be addressed. Like I said, attacking the avionics is really interesting, but getting access to them to exploit vulnerabilities is pretty difficult. 
While the AISD may not be as sexy or interesting on the surface, when you scratch a bit deeper there's a lot to decipher. Its touchpoints, while still quite difficult to exploit, are more accessible and considerably more traditional territory for an attacker, and this means everyone from cabin crew to engineers, flight ops and infosec has an important role to play in securing e-enabled aircraft at the operator level. Some advice I'd give to operators is: do not take the security of provided software for granted. Now, obviously I'm not talking about FLS and LSAPs here; what I'm talking about is the software that is provided to connect to your aircraft, so the EFBs, the ground systems, etc. Even the most modern aircraft today have some very basic vulnerabilities in that code. What is more worrying for me is that this software will have achieved some kind of approval in spite of these weaknesses. So don't assume that because they have approvals they are 100% secure. Make sure you perform some kind of assurance on software and systems, either using your internal teams or, if you don't have the skills in-house, by hiring an external expert or pentester. It may be costly, but having an aircraft AOG for several weeks while you investigate a cyber attack and have to replace all the onboard FLS or LSAPs could probably cost you a lot more. Security in operations is not the realm of one department alone. The multiple business areas that make up operations require that infosec understand aviation and what cyber risks actually mean to aircraft. Infosec also need to make sure that engineers and other business areas understand the risk context. It is a two-way street. I've seen that some operators have implemented multi-disciplinary groupings which handle these challenges. They're often called a cyber fusion cell, which is a really cool name to have on your CV, just a little hint there. 
So these groupings often have at least one board member, and they combine knowledge and experience from multiple departments to ensure that issues are tackled in the best possible way for the business. From what I hear they are extremely effective, and I would certainly recommend, if you don't have one, or if you're setting up a new business and you operate e-enabled aircraft, that you look into it. And I suppose it finally goes without saying that you must involve your regulator as much as you can. They have access to the right people, the information and the frameworks, with the right advice and guidance to see you through difficult scenarios. So make sure you engage them as much as you can, and if you don't get satisfactory results from your regulators, sure, drop me a line, I can certainly help where I can. But finally, I'll sign off with a small piece of advice for operators. If you have someone in your organisation who is as effective and competent an engineer as they are with infosec and cyber security, my advice would be to make sure that person never leaves your business. I'm not an engineer. I was lucky I had access to excellent engineers who were able to communicate to me exactly what the problems were and help me better understand the systems so I could evaluate the risks. At the same time, these engineers were highly intelligent, so they could understand what I was telling them, and together we worked really well. But if you can incorporate all of that into one person, that's a very valuable asset indeed. So that's the end of the presentation. Hopefully that's enough information to whet your appetite on the AISD. There is a lot more there; I had to cram it into 15 minutes. So I will be available for questions in a session in the Discord channel from where you linked to this presentation. Please head over there and join in the discussion. Alternatively, you can grab me on Twitter or on my website. 
There's an email link there where you can reach me, and I hope to speak to you all soon. Take care.