Okay guys, thanks for coming to the Biohacking Village. If you've been here for earlier talks today, thank you for coming back. If it's your first one, welcome. If you guys haven't visited the CTF down the hall, it's a pretty impressive array of technologies; I suggest you go over there and try to break some things. Next door we have the classroom, where we're actually assembling some prosthetics right now for somebody who gave a presentation earlier. And we also have the wonderful speaker track here, where we're sharing a lot of information from some of the great research going on in the community. In between talks, we also do some donation work here. As you know, this is a non-profit; the money collected here helps fund the projects you see around here, and in exchange we have some things we'll give you for being a participant. So I'd like to introduce Dr. Avi Rubin. He's a professor at Johns Hopkins University, serves as the technical director of the JHU Information Security Institute, and is the founder and director of the JHU Health and Medical Security Lab, working on advancing the security of medical devices and healthcare networks. He'll introduce his co-presenter himself, but I want to give you all the opportunity to welcome him here to speak at the Biohacking Village. Thank you. Okay, hi everyone. It's nice to be here. This is my first DEF CON. I'm used to going to academic conferences, and the first thing I was told when I showed up here and presented myself as a speaker, I was wearing khaki pants, and I was told you can't speak like that, this is a hacking conference, you have to change into jeans. And so I did. I follow rules, and I changed into jeans.
What's really weird for me about DEF CON is I'm not used to being the least geeky person around, because in most of the circles I move in, I'm the geek, but there's a whole other level here and I'm enjoying it a lot. As the introduction mentioned, I'm the director of the Health and Medical Security Lab at Hopkins. When you're a professor at a university, you get to create your own lab and you get to name it. And I love boats and ships, and I wanted a lab that was like the HMS lab. Does anybody know, you guys look like an educated crowd, what HMS stands for, why British ships are the HMS this-and-that? Her Majesty's Ship. That's right, Her Majesty's Ship. You won't find American ships named that way; we name our ships after presidents and states, but a lot of the British ships are the HMS of this and that, so I'm the HMS lab: the Health and Medical Security Lab. Let me give you a little background about how I got to the work that I've done; that'll segue into the technical work we're going to present, and I'll introduce my co-presenter in a minute too. So I'm a professor at Johns Hopkins, and most of you probably know Hopkins as a medical school and a hospital. In fact, when I meet people socially, like the parents of my kids' friends, they just assume I'm a medical doctor. They'll come talk to me about their ailments and things like that. If you're at Johns Hopkins, there's this kind of force pulling you to work on medical stuff. I started at Hopkins in 2003 working on electronic voting security. I did that for about seven or eight years, but when you're at Hopkins there's this sucking that happens towards you. I'm not saying we suck; I mean it pulls you in, like the Ross Perot kind of sucking, and you end up working on medical stuff, because the doctors will come around and say, hey, I've got this problem with my medical record system, can you help me?
And I'm a security guy. So pretty soon I found myself working on medical stuff. I created the Health and Medical Security Lab, started training up students, and we started looking at all kinds of things. One of the seminal things that happened in this field is a paper at the IEEE Security and Privacy Conference, which we call the Oakland Conference; it's the number one rated academic conference. It was by some of my colleagues, Kevin Fu and Yoshi Kohno, and they had taken a defibrillator, wrapped it in a bunch of meat so it would simulate being inside of a person, and launched a bunch of attacks against it. It kind of woke up the industry, because the press got very interested and it got a very high profile. So the field of health and medical security, at least for academics, really took off after that paper. I applied for grants and was part of a group that got a $15.5 million grant from the ONC, the Office of the National Coordinator in Health and Human Services, to do health and medical security research. When that project ended, we got a $10 million grant from NSF to do the same. We worked on projects like encrypting medical records using attribute-based encryption, and performing role-based access control via attribute-based encryption. That's the kind of stuff I like, so I am a geek, I do qualify for that. We also looked at medical devices. In my lab I had students who were getting their hands on medical devices, flashing the ROM and then analyzing the firmware. We developed a set of expertise and skills in that and published a lot of papers. And then some of those students from that lab came to work for me when I started Harbor Labs, which is my company.
And Harbor Labs started because once I began publishing in this area, I was getting approached by a lot of medical device companies saying, you know, we'd like some consulting help with the security of our systems. And I had all these really smart students graduating, and a light bulb went off: why don't I start a company and do this type of thing professionally? So that's what we do. That's the long-winded introduction of my background. And Mike Rushanan is here, put your hand up. Mike was a PhD student of mine who finished his PhD in the HMS lab and is now the director of medical security at Harbor Labs. He and I are going to co-speak today. I also brought him here in case you ask any hard questions: direct them to him, he's more likely to be able to answer them. So this is the high-level outline of the talk. I'm going to talk about what happens when you have a medical device and you care about its security: what should you do if you're building a medical device, you want to get through regulatory certification, and you want to make sure your device is actually secure. I'd like to base the talk on a tool that we built and show you the various steps it can take you through. Then Mike will get into some case studies of actual projects we worked on with clients who built medical devices, ran into hiccups when they tried to get certification, and show you how we helped them solve those problems. I like to ask the audience who knows what things are, because that shows me the audience is alive. So who knows what a BOM is? Wow, okay, we have an audience that is way above the level I expected for this talk. For those of you who didn't raise your hands, it's a bill of materials. And why is a BOM important? Well, let me get into talking about firmware.
So if we're looking at a medical device, the software that runs it, the logic behind it, is going to be in a firmware file. And if you're in a company that builds these things, which I think many of you are, you know that you can have a team building the software, and someone on the team puts something in there and then leaves, somebody else comes in, and there's a dynamic flow of people through the development of a project that lasts many years. There may not be anyone in the organization who actually knows everything that's in this firmware, because a firmware image is usually a Linux distribution: it's got a lot of open source packages, it's got libraries, somebody finds a tool and says this would be great in our firmware, there's a web server in there, and the firmware becomes kind of a living, breathing thing. And if you go to get the medical device certified, the regulators are going to say, okay, what's your bill of materials? And you may not know everything that's in there. So one of the things we did is build a tool that will unpack a firmware binary and automatically extract all of the materials in there, all of the software the firmware uses. That's actually not as easy as you might think, because you're talking about binaries, and the binaries may not necessarily contain a string with the name of the package you're using or the version number. So we used all the material at our disposal, both external to the firmware and inside the firmware, did some fuzzy matching, and came up with what I think is a pretty good system for figuring out all the pieces that are inside of a firmware. And why does that matter from a security perspective?
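The extraction step described above can be sketched roughly as follows. This is a toy illustration, not Harbor Labs' actual tool: the package catalog, the version regex, and the sample blob are all made up for the example.

```python
import re
import difflib

# Hypothetical catalog; a real tool's package list would be far larger.
KNOWN_PACKAGES = ["busybox", "openssl", "lighttpd", "dropbear"]

def extract_strings(blob: bytes, min_len: int = 4):
    """Pull printable ASCII runs out of a firmware blob, like `strings`."""
    return [m.group().decode() for m in
            re.finditer(rb"[ -~]{%d,}" % min_len, blob)]

def guess_packages(blob: bytes):
    """Fuzzy-match extracted strings against known package names,
    grabbing a version number when one is embedded in the string."""
    found = {}
    for s in extract_strings(blob):
        m = re.match(r"([A-Za-z][\w+-]*)[ /_-]v?(\d+(?:\.\d+)+[a-z]?)", s)
        if not m:
            continue
        name, version = m.group(1).lower(), m.group(2)
        close = difflib.get_close_matches(name, KNOWN_PACKAGES, n=1, cutoff=0.8)
        if close:
            found[close[0]] = version
    return found

blob = b"\x00\x01junk\x02lighttpd/1.4.35\x00more\x00OpenSSL 1.0.1e\x00"
print(guess_packages(blob))  # {'lighttpd': '1.4.35', 'openssl': '1.0.1e'}
```

In practice the matching also leans on filesystem paths, library symbols, and package metadata inside the unpacked image, since plain strings alone miss a lot.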
Well, there's a national vulnerability database that has these Common Vulnerabilities and Exposures entries, called CVEs, which are known vulnerabilities in various software packages. So let's say the firmware in this medical device has package X, a software package that does something the firmware needed. It's not something developed by the manufacturer; it's something open source they found, and version three of package X has a serious security problem with a CVE in the national vulnerability database. So when the tool goes through and unpacks the firmware and finds that package X version three is in there, it can say, aha, there's a CVE associated with this firmware. And if that thing gets upgraded someday to version four, maybe there's no CVE for that, because the problem was patched. So it's important to get a look into a firmware to know what CVEs are associated with it. Now, let's see how big this is coming out on here. One of the things you can hit when you're trying, for example, to get a medical device through certification is a disqualifying problem. For example, if you're using a weak algorithm, this says weak algorithm detected, and that has a CVSS score. CVSS is the Common Vulnerability Scoring System, a standardized way the industry takes certain problems and scores them. This is something you can identify in a firmware file and say, uh-oh, there's something here that's gonna disqualify me from passing, I need to fix that, I need to put something better in there. So here you can see that the tool can look through all the various CVEs and all the packages, and this is what we call the pre-market CVEs. That is, you package up your firmware, you run it through the tool, you sell your device, and you get your score.
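A minimal sketch of that CVE-matching step, assuming a hypothetical vulnerability index keyed by (package, version); the package names and CVE entries are invented for illustration:

```python
# Hypothetical slice of an NVD-style index: (package, affected version) -> CVEs.
VULN_DB = {
    ("examplepkg", "3.0"): [("CVE-2015-0001", 9.8)],   # fixed in 4.0
    ("weakcrypto", "1.2"): [("CVE-2014-0002", 7.5)],
}

def cves_for_bom(bom):
    """Given a bill of materials [(name, version), ...], return every
    known CVE that applies, the way a pre-market scan would."""
    findings = []
    for name, version in bom:
        findings.extend((name, version, cve, score)
                        for cve, score in VULN_DB.get((name, version), []))
    return findings

bom = [("examplepkg", "3.0"), ("examplepkg", "4.0"), ("weakcrypto", "1.2")]
for name, ver, cve, score in cves_for_bom(bom):
    print(f"{name} {ver}: {cve} (CVSS {score})")
# Note examplepkg 4.0 produces no finding: the CVE was patched in that version.
```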
It takes the worst of all the CVSS scores found in the device and gives you that as your overall score for the firmware. However, a firmware file, believe it or not, is a living, breathing thing in terms of its security. It's not static. Why is that? Well, imagine you put out your firmware and it uses package X version three, like I told you before. You're the developer of this firmware, so you say, oh, there's a security flaw in package X version three, I'm going to swap in version four, and now there's no security flaw. The product is out for a while, everything's going great, and then some DEF CON researcher looks at package X version four and finds a really serious vulnerability that nobody knew about. A CVE is issued, and now you're sitting there with a firmware package using that version four that is believed to be pretty secure. It's got a really good CVSS score, and it's actually not secure, okay? So what's needed is a way to constantly check the CVEs, whoops, let me, got a little ahead of myself, to constantly check the CVEs of all the packages to see if there are new ones. What our tool does is, every night, it downloads the NVD database, looks at all the packages in all the firmware that people have analyzed, and sees if there's a new CVE for any of them. Let me tell you another way your CVSS score can be misleading and where you need to stay on top of it. One of the checks is to look at the crypto: the various settings, the algorithms, the parameters of the crypto being used in the firmware package. One of those checks is whether a certificate is expired. If the certificate's not expired, that's not going to hurt your score. But what happens is time moves forward and the certificate expires.
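The two time-sensitive checks just described, taking the worst CVSS score as the headline number and re-evaluating certificate expiry as the clock moves, can be sketched like this (the "worst score wins" rule follows the talk; function names and sample dates are illustrative):

```python
from datetime import datetime, timezone

def overall_cvss(findings):
    """The firmware's headline score is the worst (highest) CVSS found."""
    return max((score for _, score in findings), default=0.0)

def cert_expired(not_after: datetime, now=None) -> bool:
    """A cert that was fine at release silently becomes a finding
    once the clock passes its notAfter date."""
    now = now or datetime.now(timezone.utc)
    return now > not_after

findings = [("CVE-2015-0001", 9.8), ("CVE-2014-0002", 7.5)]
print(overall_cvss(findings))  # 9.8

not_after = datetime(2020, 1, 1, tzinfo=timezone.utc)
# Same certificate, different answers as time moves forward:
print(cert_expired(not_after, datetime(2019, 6, 1, tzinfo=timezone.utc)))  # False
print(cert_expired(not_after, datetime(2021, 6, 1, tzinfo=timezone.utc)))  # True
```

This is why the scan has to re-run nightly: the firmware bytes haven't changed, but its score can.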
And so if you don't keep checking, you could end up with a package that's got security problems you don't know about. Now, before I hand it over to Mike, who's going to talk about three examples of actual security work we did with medical device manufacturers, with some lessons learned, the first one is about patch management, so I just want to say a couple of words about that. You all know what patch management is: the ability to take a software package and download new patches for it. And there's always the question of when do you patch, right? Is the patch going to break my software? I've got a system that works really well, and now the vendor issues a patch, and you have to weigh the trade-off: if I don't patch, I'm vulnerable to the vulnerability the patch addresses, and everybody now knows about it because there's been a patch. But if I do patch, it might break my system, it might actually introduce new vulnerabilities. So system administrators have the unenviable task of figuring out how to manage patches and when the right time to patch is. If you think that's difficult, consider a paper, I believe out of David Brumley's group, from a few years ago at one of the academic security conferences, with a title something like "Automatic Patch-Based Exploit Generation." Think about what's going on here. You've got a software package, somebody finds a vulnerability, so the vendor issues a patch and releases it.
What this tool does, and it's a security tool, but one that hackers could have built as well and not told anyone, is look at the patch and, based on what the patch is doing and on the original binary, reverse engineer the vulnerability and then build an exploit to take advantage of it, in milliseconds, all fully automated. You take a patch, and the output of their tool is an exploit for the vulnerability the patch was addressing. So you really don't have a lot of time, if you've got an online system, to get that patch in there quickly. Okay, with that, I'll hand it over to Mike. All right, thanks Avi. Hey everybody, I'm gonna tell some war stories, as it were, from our work with some of our clients, covering three different scenarios. Avi gave a great introduction to patch management. In this particular case, we were working with a medical device manufacturer on a Class III device, so one with the potential to cause great patient harm, and they were going through a PMA evaluation. That's working directly with the FDA to review their protocol and see if all risks are mitigated, and this includes cybersecurity risks, of course, which is why they were working with us, since we're cybersecurity experts. Now, when they approached us, they described their current patch management. So let's go ahead and diagram this out. And just note, I'm using "medical device" here sort of arbitrarily. We love and respect our clients; we would never use their names, so please don't ask. Cool. All right, so in the original process we have the medical device sending out a patch request. You'll note that there is this manufacturer's CDN. This is a public cloud; I'll oftentimes interchangeably say AWS because it's typically the cloud I work with and prefer. I'll have more details on why that actually matters in a bit.
So the medical device makes a patch request, it goes to the cloud, and the cloud sends a patch. Well, it's an unsigned patch. Huh, so what could possibly go wrong with an unauthenticated patch? Well, I wish the text didn't overlap, but it's subject to tampering, right? In this particular case, working with the manufacturer, we had to think about what the threat model is and who the attackers are. Now, for this patch request, I haven't even said whether it's a secure and authenticated channel, so let's imagine it's not TLS to begin with. Well, if you have a passive eavesdropper, what's going to happen? They're gonna see the raw bytes. And as Avi alluded to, when I go about a security analysis, the first thing I'm going to look at is, well, if I can capture some of those patches, I'm gonna find out what libraries are changing, what binaries they're actually changing and updating, and that informs how I perform my analysis to break your device. And let's not be fooled, any attacker is going to do the same thing. Now, another issue: assume the next more powerful attacker is an active attacker. An active attacker can modify, jam, or drop this traffic, right? That becomes problematic because, at that higher level, if they can just change bytes in that patch, they've effectively controlled the software that's going to be delivered to your system. If your medical device doesn't actually verify a signature on the patch, then you're gonna install it happily, and lo and behold, you're running code that an attacker directly controls. I should also mention, we're talking about patches here, but oftentimes this could be an entire firmware image, so it's not just some of the functionality.
If anyone doesn't believe me: anyone who has Windows 10, for example, knows your update process is like 20 gigs, because it's putting the entire operating system on there again. So this is pretty big. What we had to do is talk to the manufacturer and say, all right, what do you guys care about? We of course mentioned this issue. We said, well, do you care about confidentiality? You could potentially have IP in this firmware. And they said, you know, we're not too concerned about our IP, which is a weird statement, but fine. What they did say is, yes, someone being able to actively modify the patch is a big, big problem. How would you guys go about solving this? And this is where we started collaborating. So here's the first solution we brainstormed through. Again, we have our patch request. This time, let's just assume it's TLS, and we'll get into that in a second. So we at least have a secure channel between the medical device and the cloud. What's happening now is that the cloud has that patch pre-existing somewhere; if this is AWS, it'd be in S3, for example. And the cloud can access a hardware security module. For people that don't know HSMs, I'll do the brief 101 as fast as I possibly can. It's a crypto device, tamper-proof, tamper-evident, with secure key generation using a random number generator, and it's FIPS certified. If you're an academic like me, you might not put too much stock in FIPS. Everyone else is from industry: you love FIPS because people and regulators tell you it's good, and of course it is. The appropriate algorithms, the appropriate ciphers, the appropriate key sizes, and it makes sure you don't make any funny mistakes. No one wants to see you implement a Caesar cipher, because that's not crypto, you will be broken. And so this hardware security module exists.
Typically they're clustered, which gives you a great benefit: if one fails, you have encrypted contents elsewhere. Because if you lose, say, a private key, which is non-extractable, if you lose a hardware security module, well, guess what? You've just lost the ability to perform any of those signing operations again, and you have to bootstrap your whole system over. So all of this is great. I thought this was fantastic. Yes, use the cloud; you should use hardware security modules. So we send a signed patch, and we look at the medical device and say, oh no, guess what? This is still subject to tampering. Guess why? We were talking to the great FDA reviewers and regulators, and I'm not saying great just to be cheeky; there really are super smart people to work with. One of them mentioned, all right, where's your trust boundary? I'm like, well, you know, I trust the cloud. And they said, I don't trust the cloud. All right, well, you're the regulator, and I believe you; I too do not trust the cloud now. So we reworked the solution based on this dialogue, because it made a lot of sense and it was going to bolster the security. Then we came up with our next brainstorming approach. Again, medical device, patch request, but we don't trust the cloud anymore. So what we're gonna do, pardon my big head, I'm sorry, I'm very tall, is generate a private and public key pair. We do that locally during the manufacturing process. This is a secured network; it's within our trust boundary. And what's great is you can still use hardware security modules. They're like $25,000, and usually, I don't know, medical device manufacturers have a ton of money to throw away, so I love saying, yeah, you should get that one. And that's precisely what they did: they did the secure key generation on their side, and they use the private key in that secured manufacturing network environment.
They can sign a patch, and they can upload the patch and the signature to their content delivery network, which is the cloud, so great. These two things live in AWS, and we've pulled the cloud out of our trust boundary. We send the signed patch across. We thought we were good. Now, lo and behold, if you think about the process during the brainstorming, well, you have an authenticated patch, but the problem is, we don't have a nonce. We don't have a timestamp. We don't have any uniqueness. The reason this matters is that you've opened yourself up to a replay attack, right? There are a number of ways to mitigate this, and some of them aren't even security mechanisms. I'll just blatantly tell you that, and I can tell you what that particular solution is, because you can combine it with actual security and it works well. So here's the scenario. If I were an attacker, not even active, maybe I'm an eavesdropper, and I see a patch that was put out in, like, 2006, I save that patch. I know that at some point even that patch is going to have some vulnerabilities in it; software gets old, software ages like milk, go figure. I wait, a patch happens five years later, I see exactly what changed, and I look at the old patch and go, oh, well, these binaries also changed; I have this entire thing. Well, if there's no replay protection mechanism, what's to stop me, as the attacker, from replaying that original patch, overriding the good one and putting the system back into a bad state? Now, I won't lie to you, that scenario may be a bit contrived, and the reason is that you can implement controls inside the medical device that say, all right, while you're signed, I need one of those bytes to actually specify what the firmware version is, and I will never downgrade.
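The device-side gate just described, never downgrade plus a freshness check, might look roughly like this sketch; the 24-hour window and integer version numbers are assumptions for illustration, not the client's actual policy:

```python
from datetime import datetime, timedelta, timezone

MAX_AGE = timedelta(hours=24)  # assumed freshness window for illustration

def accept_patch(installed_version: int, patch_version: int,
                 signed_at: datetime, now=None) -> bool:
    """Device-side replay/downgrade gate: refuse anything older than
    what is installed, and anything signed too long ago."""
    now = now or datetime.now(timezone.utc)
    if patch_version <= installed_version:
        return False  # never downgrade, never re-apply the same version
    if now - signed_at > MAX_AGE:
        return False  # stale: possibly a replay of an old captured patch
    return True

now = datetime(2019, 8, 10, tzinfo=timezone.utc)
print(accept_patch(4, 5, now - timedelta(hours=1), now))  # True
print(accept_patch(4, 3, now - timedelta(hours=1), now))  # False (downgrade)
print(accept_patch(4, 5, now - timedelta(days=30), now))  # False (stale/replay)
```

Both checks only mean anything if the version and timestamp fields are inside the signed payload, which is exactly where the next design iteration puts them.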
That's a reasonable approach, and it's one thing we did build in, but at the same time we wanted to mitigate this particular risk as well. So we ended up with the next patch process: patch request to the cloud, still doing our on-premise signing and uploading a signature and patch, and then we're sending a signed patch and/or firmware plus a timestamp which is also signed. We're missing some text on there, but I'll explain. Now we're authenticated with a nonce, and this gives us a better security implementation than we had before. Now, I said there's something missing, so let me walk over here real quick, so that when the purists ask me, well, dude, you messed up, how'd you sign that?, I have an answer. One approach we can take: we have the on-premise signing of the patch or firmware, and we still have hardware security modules in the cloud, so that's a semi-trusted environment. I won't say it's untrusted, because, for example, you're putting the patches in there to begin with, so you have to have some notion of trust. So we can have yet another asymmetric key pair, and we will sign the inner payload, which is signed itself, with an outer payload, baking in that timestamp, and ship that off to the medical device. What happens is you verify the outer, pull it apart, check the timestamp, verify the inner, and there you go, we have a secure protocol. Cool, and so that is the patch process experience we had with one medical device manufacturer. Awesome. The next thing we're gonna talk about is communication protocols, and the reason I bring this up is I actually talked to some of the device manufacturers next door, and with one of the guys we had a dialogue and I was like, wow, this is really close to our system. I've seen this in multiple places, you must be doing okay.
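Here's a toy sketch of that inner/outer structure: the factory signs the patch, the cloud wraps it with a timestamp and signs the bundle, and the device verifies outside-in. To stay dependency-free, this uses HMAC as a stand-in where the real design used asymmetric signatures (factory private key on-premise, cloud key in an HSM); treat sign/verify here as placeholders, and the key values and field names as invented:

```python
import hmac
import hashlib
import json

# Stand-in symmetric keys; a real system keeps a factory private key in the
# manufacturing network and a cloud key in an HSM, and signs asymmetrically.
FACTORY_KEY = b"factory-signing-key"
CLOUD_KEY = b"cloud-hsm-key"

def sign(key: bytes, data: bytes) -> str:
    return hmac.new(key, data, hashlib.sha256).hexdigest()

def wrap(patch: bytes, now_iso: str) -> bytes:
    """Factory signs the patch (inner); the cloud bakes in a timestamp
    and signs the whole bundle (outer)."""
    inner = {"patch": patch.hex(), "sig": sign(FACTORY_KEY, patch)}
    body = json.dumps({"inner": inner, "ts": now_iso}).encode()
    return json.dumps({"body": body.decode(),
                       "outer_sig": sign(CLOUD_KEY, body)}).encode()

def unwrap(blob: bytes) -> bytes:
    """Device: verify the outer, check the timestamp, verify the inner."""
    msg = json.loads(blob)
    body = msg["body"].encode()
    assert hmac.compare_digest(msg["outer_sig"], sign(CLOUD_KEY, body))
    payload = json.loads(body)
    # (a freshness/downgrade check on payload["ts"] would go here)
    patch = bytes.fromhex(payload["inner"]["patch"])
    assert hmac.compare_digest(payload["inner"]["sig"], sign(FACTORY_KEY, patch))
    return patch

blob = wrap(b"new firmware bytes", "2019-08-10T12:00:00Z")
print(unwrap(blob) == b"new firmware bytes")  # True
```

The point of the two layers is that the cloud can vouch for freshness without ever being able to forge the patch itself: tampering with either the payload or the timestamp breaks a signature.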
Here's my card, you should call me. Okay, so we're gonna talk through this process. Let me take a step back. We see three beautiful images up here that I did not make, but they look great, and we're gonna talk about their relationship. In this particular case, we have a personal medical device. This is a wearable; it's on you all the time. We have a mobile device, a smartphone, Android or iOS, which is going to be in direct communication with this device, and we still have our cloud component. So one might imagine, well, how the heck are these things gonna communicate, and where is your trust boundary? Those are all fantastic questions. Before I get ahead of myself, I'm gonna hit the right arrow a couple of times because I'm gonna mess up these transitions. Come on, is there one more? No, that looks like it. Yep, cool, they're all up there, they're just not showing there. So let's think about it this way. You're gonna have this smartphone. The smartphone is going to configure the actuation of this particular device. We're gonna make some assumptions: close physical proximity, so you might imagine, well, what protocol would you use there? BLE, of course. This is all patient configurable, but the medical device, since it's doing actuation, is also doing sensing, right? And we have to imagine what hardware medical devices have. I hate when people argue they can't have really good robust hardware; these days a Raspberry Pi has four gigs of RAM, anything can have good hardware. But the argument is that these won't have a NIC, and I can buy that. So I might have a Bluetooth LE module, and some driver around it, so it can talk, but it doesn't have Wi-Fi, so it's not connecting to your wireless router, which also makes a lot of sense. How would you, for example, take a wearable and connect it to the Wi-Fi? It's kind of a contrived thing you may have to do.
So what we need in this architecture is for this personal medical device to communicate to the cloud over the smartphone. Now, I can tell you, you're all hackers, I bet a bunch of you actually run LineageOS or something equivalent, and your phones are probably rooted. Don't trust the package managers just because their packages are signed; of course, Cydia signs all of its packages, but who the heck uploaded those things? So from the perspective of the medical device manufacturer, that phone is not trusted. It should never be trusted. It's not even semi-trusted. It's great, you can configure your device with it, it's on you, it's cool, we run an application, but if there's any personal health information, PHI, don't trust that particular device to look at it, because it could siphon it off and send it somewhere else, right? Cool. So now let me draw all these arrows out too. There we go, that way I don't miss any of them. I think that's pretty good. Right, so what would you do then? You have a BLE connection. Well, if you know anything about the BLE specification, there are maybe four, I think it's four, it might be five, don't quote me, authentication mechanisms that you could use at the transport layer to secure these two devices. But requirements and design actually lead how we think about our risks and our controls and what the system ends up being. In this particular case, it was, well, that's great, we can have a pairing protocol, we can have a PIN, but that seems arduous. And actually, we don't want to impact the patient experience; we want these devices to just work. The anecdote here is, if anyone's ever had a Bluetooth speaker and you go to connect to it, somehow your friend's phone starts playing to it, then they disconnect, someone else disconnects, and the whole thing just turns off. It's because, well, it's kind of garbage, and everyone hates trying to pair with these devices, right?
So in this particular case, what we came up with was a scheme with end-to-end encryption and authentication between these two devices, because the manufacturer has access to this device when they configure it. They have access, they put the boards in, they put the plastic pieces together, maybe they 3D print it. And that gives us a lot of power in terms of how we might write a cryptographic protocol. Now, I'm saying write the cryptographic protocol; don't misquote me, we're not inventing AES 2.0, because that's just silly and it will be broken. We're using the ciphers and the algorithms and key sizes that are recommended and mandated by standards, and using them in an intelligent way to create a hybrid scheme that just works. Now, since I said we're creating our own cryptographic protocol, the most fantastic part was being able to use that same protocol so this device can talk to the cloud, over the phone, without the phone seeing the data. And that's what we actually came up with. So let me just hit next to see, I did miss a bullet, that's fine. That looks like it's okay. Yeah, you need to be connected before you go to the cloud, that's good. And let me see, yeah, I clicked through these because I actually wanted to do them in a different order. I'm gonna talk about the personal medical device and the cloud first, all right? So you're in the configuration stage. How might you bootstrap the security? Well, one way is you could take the medical device as you're putting it together and do secure key generation, right? On a modern architecture, if we're on an ARM processor, there are things like TrustZone and crypto co-processors you can put into this device to securely generate your key pair. What that means is you have a good sense of your security protocol and how these keys are generated. They're unique for every device.
Never, never hard-code a key, because you'll get owned, and then you'll end up at a talk here and everyone will laugh at you. So you want to generate your key this way. You're gonna have a key pair; it's asymmetric. The cloud also has its own asymmetric key pair, and while you're in the trusted manufacturing process, you do key exchange: you exchange the two public keys with one another, and this is going to allow you to bootstrap the end-to-end encryption process. And glossing over some of the details, we might argue, well, asymmetric key pair, one's for authentication, one's for encryption. Yes, it's also a hybrid-based approach, so there's going to be a symmetric key. I won't talk much about that, and I have three minutes, so I should go really quickly. But the idea is that since you bootstrapped this, once you go to put these things online, you're going to have a pairing protocol with that phone. What that phone's gonna do first is pair to the cloud as part of the process. It's gonna generate a key pair and exchange public keys with the cloud. The reason you're doing this is that you want the cloud to sign the public key it receives. The reason you want it signed is that when this pairing protocol happens between these two devices, you need to verify that that phone is actually trusted. Like, do you have an account? Do you own the device? Did you talk to the cloud? And once that is done, since you've already pre-established the secrets between the cloud and the medical device, lo and behold, you can do end-to-end encryption. So you can do, for example, an elliptic curve Diffie-Hellman key exchange, let's build perfect forward secrecy into it, and we can encrypt data and send it back and forth. It can be PHI, PII, and the mobile device is none the wiser. Yep, and we get a giant green check mark, which is great. So I'm gonna go through this one in three minutes. I think I can do it. Well, he said three minutes.
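The forward-secret session described above can be sketched end to end. This is stdlib-only, so it stands in a toy Diffie-Hellman group for the real elliptic curve exchange, and a SHAKE-128 keystream plus HMAC tag for the real AEAD cipher (e.g. AES-GCM); everything here is an illustrative stand-in, not the protocol from the talk.

```python
import hashlib
import hmac
import secrets

P = 0xFFFFFFFFFFFFFFC5  # toy DH prime; a real system would use ECDH (e.g. P-256 or X25519)
G = 5

def dh_keypair():
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

def derive_key(shared_secret: int) -> bytes:
    # HKDF-like step: hash the raw DH shared secret into a symmetric key.
    return hashlib.sha256(shared_secret.to_bytes(8, "big")).digest()

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Stand-in for AES-GCM: XOR with a SHAKE-128 keystream, then an HMAC tag.
    nonce = secrets.token_bytes(16)
    stream = hashlib.shake_128(key + nonce).digest(len(plaintext))
    ct = bytes(a ^ b for a, b in zip(plaintext, stream))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    if not hmac.compare_digest(tag, hmac.new(key, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("authentication failed")
    stream = hashlib.shake_128(key + nonce).digest(len(ct))
    return bytes(a ^ b for a, b in zip(ct, stream))

# Fresh ephemeral key pairs per session give perfect forward secrecy:
# compromising long-term keys later can't recover this session's traffic.
dev_priv, dev_pub = dh_keypair()
cloud_priv, cloud_pub = dh_keypair()
session_key = derive_key(pow(cloud_pub, dev_priv, P))

# The PHI travels through the phone as opaque ciphertext; the phone
# just relays the blob and is none the wiser.
blob = encrypt(session_key, b"glucose=104 mg/dL")
```

The point of the sketch is the shape: ephemeral key agreement, a KDF, then authenticated encryption, with the phone reduced to a dumb relay.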
Yeah, I'll be quick, I promise. All right, for this one, a medical device. The issues that came up were all access controls. So physical access controls, maybe network-based access controls. The problem is, you know, you might only have a single set of access credentials. You're admin, you're root, you have access to everything; that's probably bad. Unsecured protocols, such as FTP and Telnet: don't want those. Anyone can easily eavesdrop on the data and mess with it, not good. A USB port for maintenance and configuration, well, this is just another attack surface that someone can tap into. Direct access to therapeutic controls, well, you know, if an attacker has physical access, they're probably gonna break it with a hammer first, but we like to mitigate as many risks as we possibly can. So maybe we put in a PIN to authenticate the user to the control pad. And finally, just separating clinical mode and operational mode. When this thing is doing its business for the patient, you shouldn't be able to do a firmware update, because that is just gonna hose everything and be a whole mess of problems. Yeah, so I talked about all these; that means that slide goes quickly. So fixes, as you might imagine, are basically common sense, right? Get rid of unauthenticated, unencrypted protocols: use SSH, for example, and use TLS if you have HTTP. Also, requiring physical access requires having a unique PIN for every single device. Let's not have default hard-coded credentials and secrets, because again, if you figure out one PIN and they're all set up the same way, well, then maybe you know the PIN for all hundreds of thousands of devices; probably not a good thing. The communication functions, you want to be able to partition by hardware, so isolated worlds. If you're doing actuation, you probably don't wanna be on the same board or processor as the thing that's running your TCP/IP stack. Because guess what? Vulnerabilities happen in drivers; that's a kernel-level exploit.
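One common way to get a unique PIN per device without provisioning a database onto the factory line is to derive it from a master secret and the device serial. A minimal sketch, assuming a hypothetical factory master secret held in the manufacturer's HSM (the secret, function names, and serial format are all illustrative):

```python
import hashlib
import hmac

# Hypothetical factory master secret -- lives in the manufacturer's
# HSM, never on the devices themselves.
FACTORY_SECRET = bytes.fromhex("00112233445566778899aabbccddeeff")

def device_pin(serial: str, digits: int = 6) -> str:
    """Derive a unique maintenance PIN from the device serial.

    No default, no hard-coded value: leaking one device's PIN
    tells you nothing about the rest of the fleet.
    """
    mac = hmac.new(FACTORY_SECRET, serial.encode(), hashlib.sha256).digest()
    num = int.from_bytes(mac[:8], "big") % (10 ** digits)
    return f"{num:0{digits}d}"

pin_a = device_pin("SN-000123")
pin_b = device_pin("SN-000124")
```

The derivation is deterministic, so support can recompute a PIN from the serial, but an attacker who extracts one PIN can't generalize it to the other hundreds of thousands of devices.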
You get root on the box, but you don't, again, wanna affect the patient. Lastly, like I said before, if you're in clinical mode, you shouldn't do a firmware update. So these things need to be cleanly isolated, such that one cannot impact the other. And that fixes our medical device. And I'll let Avi finish on the last slide. Sorry guys, I went a little over. Okay. Thanks a lot for that, Mike. So just as you saw, each of these examples showed some pretty straightforward problems and some pretty straightforward solutions, but there are so many things that you actually have to keep track of. The complexity can be really high when you look at everything together as a holistic system. So it's really not just the firmware, although the firmware is really important; it's the entire system. I think that this industry is very lucky that it has really top-notch regulators; the people that we've spoken with on behalf of our clients are super technical and very smart. And unlike some other industries where the regulators maybe aren't quite up to par, here they really are good. And so that's something that the medical device manufacturers actually have in their favor. And I think we're out of time, but thanks a lot for your time and attention.