Welcome to DEF CON 28 and the Do No Harm panel. This is a healthcare security conversation, and for the next 45 minutes, followed by a 45-minute Q&A, you're gonna hear from some of the world's top experts in the healthcare cybersecurity space. Welcome. Thanks, Quaddi. I'm Replicant. We're going to get out of the way here in just a moment and introduce you to our panelists. But obviously, a little bit different of a venue for us than the last couple of years. We hope that everyone is staying safe and healthy, and we're thinking of you and look forward to when we can hang out together in person. Without any further ado, I'm gonna let our panelists take it away and say hi. Hi, everyone. My name is Ash Luft. I am a biochemist, a computer scientist and an electrical engineer. I work at a medical device firm called Starfish in Canada. I'm a software engineer there and an advocate for security and privacy. I think that means that I'm next. I don't have a handle, unfortunately, so I am just Jessica Wilkerson. I'm a cyber policy analyst with the Food and Drug Administration, and I work on cybersecurity policy there for medical devices. Some of you may know me from my previous time with the United States Congress Energy and Commerce Committee, where I worked on cybersecurity policy for them. Very nice to talk with everyone and looking forward to the panel. Thank you. Hi, this is Vee. Everyone should know me. The short, the sweet, the sassy, the spunky. I'm leaving my company for the wilds of Norway, where I'll be having a whole host of minions to help me do medical device research. I'm currently doing independent research with Formatronic, aiding them to get more visibility into their products and their security. I'm an advocate for patient rights as well as keeping things safe; keeping it simple, keeping it stupid is my saying. And that's me, short and sweet. And that leaves me. My name is Vidya Murthy.
I work for MedCrypt, a startup in the space of bringing cybersecurity features to medical devices. I absolutely am passionate about the subject matter and think that we're at an inflection point here, and there's gonna be a big amount of change as we go forward. Well, on behalf of Quaddi and myself, we are just so incredibly grateful to have an amazing panel this year. And for those of you who are joining us for the first time, this is kind of one of the silver linings, I think, of this format: in past years we've had folks who wanted to kind of get in on this conversation but weren't able to due to space constraints. So we're really thankful for the opportunity for anybody who wants to come in, listen in and join with us to be able to do so. So for those who may not be familiar, what we're gonna do is have a little conversation here between these four awesome panelists, and we're gonna stay out of it as much as possible but get their insights and kind of what their thought processes are on some of the things we've been seeing in the space lately. And then part of this session will be a live Q&A follow-up conversation afterwards. So we don't really have a formal set of questions like most of these kind of more rigid panels do, but I do wanna kind of start off on a little bit of a brighter note. I think some of the times that we've had this event in past years, we spend about 45 minutes talking, the term dumpster fire comes up not infrequently, and then at the end we try and push a little optimism, because there really is a lot to be hopeful about, in large part because of the types of things that we're able to do at places like DEF CON. But I wanna kind of start off this time upfront by asking each of our panelists: what are some developments lately that have helped make you feel particularly hopeful or optimistic about where we're heading in healthcare security, before we get into some of the challenges that still remain?
So I'm gonna toss it back to our panel for us to get a little dose of optimism. Yeah, so I think looking at this year in particular, we've seen more and more devices going home with patients. We've seen hospitals go online in a way that they hadn't planned to for another 10 years, right? And I think the notion that we can now build in security as a necessity, to let patients have confidence in operating these devices from their homes, is something that seems to be top of mind for folks and actual common practice. So it feels like a confluence of various factors that are coming together at just the right moment. We have this push of devices leaving the hospital. We have this push of device vendors really getting into the narrative more than ever before, and this collaboration that DEF CON brings together between researchers and device makers themselves. And I just have a lot of hope that this year has the momentum behind it to really get this to stick and have the forces that are really gonna make this last in the future. Yeah, I echo that, but I've also seen this big push. I look back three years, when you could not even have a conversation with a manufacturer, right? It was a brick wall. This year we've been invited to the table; hackers are no longer the criminals that are lurking in the darkness. We've brought things to light. They're difficult conversations, but we're having them. And I think that's a major push forward. But beyond that, we now have developers and engineers thinking more security-wise, and we've started to change the culture of working in silos to start working as a collective. And one example that I've seen is the way that the community has jumped in with the 3D printing, PPE, mask making, supporting medical people with a helping hand. And I mean, even the CTI League that's protecting hospitals around the globe from cyber criminals.
I mean, that is just a collective pull of humanity showing that if we work together we can change the world one byte at a time. What do you say, Jessica? Oh, I like the prompt. I mean, I think I started working on healthcare cybersecurity issues right around the 2015, 2016 timeframe. And at that point, obviously, I was working on them for the United States Congress. And it was just an entirely different conversation. I mean, the things that we were talking about, like the others have said, about let's bring in security researchers, let's have them be part of the conversation, they have valuable experience and things to contribute, it was just such a non-starter, and you had all these manufacturers who absolutely were never going to do that. That was completely unacceptable. We were still in some cases getting asked for help by the manufacturers to essentially be like, can you make this problem go away? And there were just other policy issues that have come up over the years. I work a lot on legacy device issues. I work a lot on software transparency issues. And they always started out as: this is never gonna happen, we're never gonna be able to do this, the industry will never accept this. And in 2020, it's completely different. I work on those issues every day. They are starting to be operationalized. Security researchers are a huge part of medical device and healthcare cybersecurity overall. We've just really made so much progress. And we've made the progress in such a way that it has its own momentum now. It's gonna carry itself through. And I just think that it really can't be overstated how valuable that is. Yeah, so I agree with what everyone said. I would say that I think one of the biggest things I've noticed over the last five, 10 years is the people.
So community, and the number of people sort of coming together. There seem to be lots of different groups and organizations popping up, even in the last couple of years: advocacy groups, people working together, yeah, medical device manufacturers. I mean, even when I started having these conversations five years ago and asking questions about security and privacy, it was harder to find other people to even have a discussion with about these things. And now it just seems like there's this momentum that's really building, and the number of people who are engaged and wanna make a difference and wanna be a part of the conversation is growing. And I see that as a huge positive thing, because realistically, the more people that we have participating in the conversation, the more action we're gonna be able to take, sooner and faster. Yeah, I think one of the big things also is we've seen physicians coming to the table, wanting to learn more about the devices they have and work with, right? It's not just the security people driving it, or policy driving it. And I think the healthcare industry has come back in with a bang, saying, how can we make this better for our patients? Because in the end, I think every physician wants to make it safer for their patients. And that's a big thing for me that I've seen this year: the physicians coming to the table saying, well, how can we learn more? How do we translate this to our patients in a way that they understand? Yeah, I was gonna say, I'd echo that, because I think even going beyond patients and things, one of the things that my boss is really quite insistent on, and you probably all know Suzanne Schwartz one way or another, is the shared responsibility of the healthcare sector, where even we at the Food and Drug Administration can only do so much. We have jurisdiction over the medical device. We don't have jurisdiction over hospital networks.
We don't have jurisdiction over other parts of it sometimes that are certainly implicated in cybersecurity concerns. And so there's traditionally been a very fraught relationship between, for example, medical device manufacturers and healthcare delivery organizations, so hospitals and others. And what I've started to see and have personally experienced over the last couple of years is a greater and greater breakdown of the barriers between those two groups. So with some of the partnership groups that exist, I spend a lot of time working with the Healthcare Sector Coordinating Council, for example, you have hospital CISOs who are on the phone every day with global product security officers at medical device companies, and they're just having conversations. I mean, sometimes they're just shooting the shit and sometimes they're actually talking about work stuff, but those relationships being there and being already established are just so meaningful. It allows the progress in the sector to be made so much faster and problems to be addressed so much quicker. And so I would echo what V said: we have almost new entrants into the conversation that are letting the conversation really take off. I just wanna say, sorry, go ahead, Christian. I just wanna say these are all fantastic insights, and I love to hear the optimism around how the conversation has changed, because every year when we have this event, we undoubtedly have to talk about an event that's happened during the year, whether it's WannaCry or whether it's a security researcher that's been shunned or threatened by a device manufacturer. And, I think I can say this, this is the first year where I have not been publicly aware of such an event. And so I think that's really proof of what everyone's saying here, that at least part of the interaction with hackers has changed.
Now, whether or not that'll stick, or whether or not this is just an off year, I think we'll have to see. But when we have strong partners in this space, hackers coming together with health organizations and device manufacturers welcoming their collaboration, everyone wins. And that's really what makes patients safer at the end of the day. I wanted to just also say, though, a lot of what was discussed is optimism around this being a great year. It's also a horrible year. COVID, like, 2020 is one I would love to forget. There are so many things about this year that have absolutely sucked. One of the things that's been voiced to me by a variety of people is: how is COVID going to impact this? How is COVID gonna impact hackers, security researchers, security professionals and the momentum that we have made? Because there's legitimate concern, I feel, that a lot of security work, a lot of securing these types of spaces, costs money, and a lot of that money is being sucked from what would be security budgets into responding to COVID. And as a consequence, do people here think we're gonna see a regression? We've made 10 steps forward; are we gonna go five steps back because COVID really stopped the momentum? Love to hear your thoughts. So I'll start, sorry, Vidya. I think it's broken the perimeter, right? It's forced us to pull up our socks, right? The healthcare perimeter is now no longer within your hospital. You have patients in varied areas and everything is different. But I don't necessarily think taking 10 steps back is a bad thing, right? This year has given us time to assess, time to view the healthcare scene, the medical device scene. And if anything, it forces us to slow down and observe and just take note of what's happening, right? I see hospitals crumbling under a pandemic, when their very function is to care for patients.
And we see that even the biggest hospitals are struggling, and that perhaps we just need to take a step back and rebuild. Maybe 2021 could be the year that we build things better. Instead of trying to slap a Band-Aid on, we do things right and we do things better from the ground up. Yeah, I love that notion of taking the time to rebuild. I mean, we've seen healthcare facilities that have already been compromised before they saw their first patient, as part of trying to meet the need for COVID. One of the things I worry about is, in some sense, I think there's this urgency from a clinical perspective to get devices out to patients and kind of solve some of the medical problems. Has that resulted in these devices having security measures that were intentionally not so robustly built in from the beginning, just so they could get out and treat these patients? And what do you do now, right? Can you walk that back? Can you say, hey, give all those devices back to us? We don't necessarily wanna see that. So I think your point of needing to build from the beginning is absolutely heard, but I wonder if you're realistically able to call pause on the care that's already out there. I don't know. I mean, I always refer to it as the legacy problem, right? Literally, it's a sea of devices out there, and I mean, if you look at the number of devices out there every year, it just increases exponentially. I mean, the maths blows my head. I sat with it the other day and I went, well, how do we solve this? And I think one of the solutions is to do things better going forward, right? It's not to introduce new devices with the same flaws and the same vulnerabilities, adding to our problem. Because those devices, we know, if you take an ICD, for example, can last in excess of a decade. If that's 600,000 devices implanted a year, lasting 10 years, it is an ocean that you're trying to boil.
You can't necessarily say to someone, hey, I need to cut that device out because it's got a flaw in it. My device has got a flaw in it that I'm fully aware of. But what are the options? I have to go for 10 years with this device until it's made better. But these devices were not built with security as a functional requirement, okay? They were built to clinical requirements. And I think if we start shifting and start building and designing with security in mind, we can start addressing the problem going forward. In terms of the legacy problem, I actually don't know how to boil that ocean, to be honest with you. Yeah, I like what everyone is saying. I like the idea of being able to rebuild and taking the step back to think about how we wanna solve some of these problems. I think that in practice, it's really hard. Again, looking at the clinical perspective: right now we're working on trying to create a massive number of respirators and ventilators in a short period of time and make them safe and make them functional. And, you know, how much security? Like, if it costs more and it takes more time to add the security, it's hard to make the argument: what's more important? You know, if people are dying and it's literally minutes or hours that make the difference between life or death, and you can get one more ventilator into a room, you know, if it took an extra two days, even a day, to add security, let's just say for the sake of this problem, is that worth it? Whose lives should we sacrifice to add the security in? Like, it's tough. I don't know how we can sort of push for that and where we draw the line in practice. It's a hard thing. Sorry, Jessica. Oh, yeah. No, no, I was just gonna say, you know, I think the way that we've certainly experienced it at the FDA is we're doing things in parallel. So I have been very blessed in my time at FDA not to have been fully pulled into the COVID response, so I'm still 100% on cybersecurity.
And essentially what that's allowed us to do is, while, you know, a significant portion of the agency is all in on COVID, figuring out what they need to do and doing some of the things exactly like you all are saying, getting devices to patients who need them, we also still have a very dedicated team at FDA looking at cybersecurity. And there's another thing I was really lucky about when I came into FDA. I think everybody, you know, elementary school feels like a bajillion years ago, but, you know, you walked into elementary school and there was that one person there where you were just like, we're gonna be best friends, we're gonna do great things together. For those of you who have not met Matt Hazelett, he probably is, like, sick of me talking about him. But anyway, Matt Hazelett is great. And he is essentially one of the leads at FDA for doing cybersecurity reviews of devices as they come in. And so at the same time that all of this is going on with COVID, what Matt and his team within the device review office at FDA have really started doing is getting tighter and tighter, and looking at exactly what you all are saying. We're learning so many lessons from deployed devices about vulnerabilities that are showing up. We're learning so much about what manufacturers maybe aren't doing on the front end that is causing problems on the back end. And what Matt and his team have really been able to do is take all of those things. They're taking the vulnerabilities that are showing up in post-market. They're taking the missing pieces of the process, the gaps in the development of medical devices. And they're putting that into the pre-market review process, where they're essentially saying, okay, we are learning every day how to get better at reviewing devices, and they're implementing it.
So I think, while we're certainly seeing this very valid urgency about getting devices out where they need to be, we also still have this very robust mechanism, at FDA in particular, for making sure that anything that we're learning, anything that's coming up with regard to cybersecurity, is actually making it back into our process. Yeah, the one thing I would say, on the financial piece, which was the original question, right, what impact is the financial drain gonna have on these devices? I think there's almost a complement there, right? If we can get the financial decisions to be informed by what the pre-market is telling us in terms of meeting certain security requirements, and really have that be key criteria in decisions, and not just, no offense to the clinicians, the clinician really likes this one brand and that's what they're going with. Having it be part of that core decision, I think, absolutely will drive that change. My only decision about whether or not I use a device is how many 5G microchips I can inject into your patient. Well, does it have to say AI in it? No, that's the blockchain version. That's 4G. This is being recorded, man. That is gonna be clipped. That is gonna be taken out of context. It's gonna be viral on Twitter in the next couple of days, and we're gonna hear about how doctors are a part of the 5G conspiracy. So strong work on that one. Yeah, well done. What I did wanna ask, right, so I'm gonna turn it around and ask Quaddi and Jeff a question, seeing as you two are the clinicians, right? When you are making a decision for your patient, right, security versus clinical functionality, how do you do that balancing act? Because you guys are fortunate as physicians to have your feet in both worlds. So from your perspective, how do you balance it out? How do you work it out in your mind?
Because for us, it is security first and foremost, and then patient care; we come at it more from the security perspective. I've been a patient, I've seen that side, but I'm quite keen to hear from your side what your perspective on it is. Yeah, that's a great question. And I think that COVID is really starkly forcing us to understand and choose between some of these seemingly opposite goals. In medicine, we frequently have to weigh multiple different pros and cons for any particular treatment or situation. You know, a patient may need anesthesia for a procedure, but they're also very tenuous from a cardiovascular standpoint. So, you know, what's more dangerous, not getting the procedure or putting them under anesthesia? And so oftentimes you have to say, well, we need to take both into account simultaneously. And the points that we've made about the necessity and the urgency of a situation are absolutely correct. When we're in the ICU or in the OR dealing with these patients, you know, security is not at the forefront of our mind. It's what technology do we need to achieve the physiologic goals we have for that particular patient. When we go home and we talk to people like you and we're able to put on our hacker and security researcher hats, obviously we start to be able to understand and conceptualize the consequences of some of the downstream effects, right? So it is a complete balancing act, 100% of the time. Anybody who tells you they have a perfect formula for that answer is lying to you. The thing that's been somewhat reassuring to me is we have had conversations with people that we work with in our hospitals and elsewhere where they understand that you kind of have to shoot for both.
And in some of our institutions, you know, I've had people from the disaster and emergency management side of things say, hey, we're gaming out the COVID response right now, but we wanna fold a, you know, an information security infrastructure exercise into that as well. Understanding, from some of these events, whether it's UCSF and the ransomware issue or some of the hospitals in the Czech Republic that have been hit during this period, the underlying assumption that people are now realizing is that you need a stable infrastructure and good devices to be able to treat patients and have the best outcomes in a crisis situation like this. So deficiencies on the security side only hinder your ability to achieve your main mission. And so it's less a shunting of resources from one to the other, and more how can we maximally benefit both of those aspects, which is never perfect, but it's a better way of thinking about it than a trade-off. Yeah, I just completely echo that. Like, one of my nightmare scenarios is treating a hospital full of COVID patients where the ICUs are overflowing and then getting hit with ransomware, for example. So what little bandwidth we have left to handle the surge, to treat the patients as best we can, goes out the window when the digital tools we use, whether it's the electronic health record or connected medical devices or all of the above, are impacted by an attack. And so it's not just maybe a little bit of impact to patient care. It's gonna be huge. And one of the things we all like to talk about, and I think we all know in this space, is just how dependent doctors are on connected medical technology. Doctors, nurses, technicians: modern healthcare is exceptionally connected, hyper-connected, if you will. And a lot of doctors, you've heard us say this many times, Jeff and myself included, have never worked on paper charts.
We've never had light boxes where we pull up a CT scan that's printed on film and put it up against the wall to read it. We've always accessed those through workstations and PACS software. So you take what we're trained on, this connected, vulnerable digital infrastructure, you add a pandemic on top of it, and then you take away the tools that we're using, and undoubtedly patients would suffer. So that one-two punch is definitely something very concerning. To the heart of your question, which is how do we as clinicians make that trade-off or decision about, here's a more secure device, but it has less clinical utility or functionality compared to this less secure device, et cetera. So how do we make that trade-off? Honestly, I'm looking straight into the camera: it's really hard to even have any bearing on that decision. A lot of the devices that we use every day in clinical practice, we don't choose. We show up to the hospital and there are monitors that are there because somebody made that purchasing decision five, 10 years ago. When you train to become a doctor, you may train for four or five years on a particular set of medical devices that you're gonna implant, a particular brand. You become familiar with that. And so what do you do when you get out into practice? You use the exact same one. And so, believe it or not, there's much less time to reflect and much less ability than most people believe for us to pick which devices we use in clinical practice and to make that decision. With that being said, we're trying to educate other doctors on this. And when we do, like Jeff mentioned, they care. They want their patients to be safer. They wanna use secure medical devices. They don't want their patients' health information, or even their health, to be at risk. So we have sympathetic ears from the clinicians. We just lack the ability to make it easy for them.
What you don't want is your doctor to have to go through 14 years of training to be a doctor and then have to take another year of cybersecurity coursework or whatever to become competent. You really have to make it easy for them to make the right decision, and convince them that they should tell other people in their hospital and other doctors that it's important. I think to that point, that's maybe a misperception, the expectation that everyone in the supply chain has to become a cybersecurity expert. I think maybe that's a fatal flaw in how we think about solving this: the idea that we need to sufficiently educate individuals by telling them, you're not a cybersecurity practitioner, but hey, I can teach you all about cybersecurity here. Like, it's really hard to think that you can sufficiently educate folks on something that isn't their core competency, which is not to say they can't have some level of understanding of what the impact is. And I think making it tangible for their work stream or function or whatever the case may be is the perfect way to do that. But I would love to hear kind of the thought around how we think about leveraging those who are experts, and not necessarily trying to solve it ourselves and going at it alone. Because I think that that's probably a fatal flaw and has caused some of the historical challenge that you're talking about when you inherit a hospital full of devices that you didn't pick. And surely as well, you know, like Jeff was saying, normally physicians are used to sort of having to balance different pros and cons of different requirements to make the best medical choice for each patient. But I think historically maybe they weren't even aware that this is one of the things they need to consider.
So even just, how can we raise awareness, and just put it in the consciousness that, oh, that's something that I need to consider, and even if I'm not an expert, maybe I can have an expert on hand, or I can read a rating for a device and make a decision somehow that way? You know, what kind of solutions are there moving forward in that direction? So I've actually spent some time working a lot with the cardiologists and their technicians in South Africa, trying to just have the discussion. And I had a phone call from my cardiologist saying, we had a recall of devices, I need to explant a device, but I don't know how to explain it to my patient. I don't even sufficiently understand the engineering talk in the document; can you translate? So I think the big thing is we are using our language, our linguistics, you know, instead of making it familiar to the medical practitioners. And I think there should be a function in the hospital that does that translation, because I can tell you, that was a very difficult conversation to have with a 65-year-old lady that doesn't understand technology. I mean, the doctor didn't even sufficiently understand why he had to explant the device that was just being recalled. And that's the unfortunate thing: they are expected to make these decisions. And as he says, he has a patient coming in, his blood pressure is high, he needs to adjust the ICD or the pacemaker. What the patient doesn't tell him is that his wife cooked a high-sodium meal, which affected his blood pressure. And a week later, he comes back because he's having issues; therapy has changed significantly. So those are the challenges they deal with, you know, having to solve the clinical puzzles. And I think adding, you know, the technical stuff on top of it, especially using terminology they don't understand, is where we've been going wrong. We should be finding ways to translate it better for them.
Yeah, and also I think that that's a failure overall of how we've designed not only medical devices, but how we do cybersecurity overall, right? Like, so many times, if somebody gets hacked or whatever, it's their fault. Setting aside corporations who get, you know, DDoSed or whatever and then claim to have been a victim of a sophisticated cyber attack, there are times when people get hit by something and it really wasn't their fault. There were so many different things that they were expected to know. And something a professor once said to me stuck with me, when we were talking about this whole thing of, like, oh, people are just so dumb about cyber, why can't they just be better? He's like, yeah, and all those people who were driving Ford Pintos, what the hell were they thinking, letting their cars blow up? And, you know, that was sort of a light bulb moment for me, of being like, oh, we have to design devices that don't blow up in people's faces. And, you know, we have to make it as easy as humanly possible to do something securely. Like, doing something securely needs to be the easiest thing. And if doing something securely is the most difficult thing, that's not the fault of the user who then didn't do the secure thing. That is the fault of the designer who made the device poorly. And so I think, you know, we've gotten away, frankly, in a lot of cases with being really lazy and offloading the responsibility onto other people. And I think what we have to do, especially within the healthcare sector, in all parts of it, is sort of reclaim the responsibility that has been ours all along and actually really do what we need to do. Yeah, I mean, I'm very big on responsibility: the responsibility to build these things better is the manufacturers', right? They're the ones that have the control over the firmware and the hardware.
We shouldn't be pointing the finger at the hospital saying, hey, your network got hacked because you have vulnerable devices. Well, those should have been built better, right? Because now we're left as an industry putting band-aids on, right? The vulnerability is disclosed, now we have to run, and we're always on the back foot fixing stuff. But the manufacturers should be stepping up and should be doing a better job of this, because it's their responsibility. I completely agree that there's a lot that the device manufacturers can do, should do, and I think some of them have been doing, but there is also a key important thing: they can't control it once it's deployed. So if they're deploying a somewhat secure medical device on an unsecured network, or if they're turning off security controls for the sake of ease of integrating into their clinical environment, or if they're not patching critical systems around these medical devices, there's some shared responsibility there. And I think that's really important: we need to be having not only a strong conversation with the device manufacturers so they do the right thing from the start, but we need to be giving hospitals the appropriate resources and education, and holding them accountable when there's something egregious. Now, I don't think they should be asked to solve the issues of insecure medical devices, because they just can't. But we've seen time and time again some pretty horrible security practices by healthcare institutions that have nothing to do with medical device manufacturers, but yet still could have patient safety implications: things like all the denial of service attacks we see, the ransomware attacks, the theft. I mean, it's not just critical healthcare infrastructure like hospitals, we've been seeing attacks on research.
So I'm sure people have seen in the news, and I think Jeff mentioned, ransomware attacks on critical medical research infrastructure, state-sponsored attacks on COVID research and vaccine trials data. So we have to have a shared responsibility. This is a whole ecosystem. It's really trying to figure out what's the best bang for our buck. And I feel like a lot of the conversations focus around medical devices because it's a much easier thing to tackle than trying to go to a rural access hospital in South Dakota and fixing their broken network and then being able to monitor that, and continually asking, you know, how are we gonna secure every hospital and clinic and doctor's office in the entire United States? That's a much harder problem. Sorry, I'd like to jump in. I just wanted to say, right, it's not just a United States problem, right? It's a global problem. You know, hate to be the one to point it out, but this is a big problem everywhere. And I know everyone on here is from the US. I'm not the only oddball, I'm from Canada. I don't feel so alone. But the thing is, every hospital is different, right? There's no cookie cutter way of looking at the problem. Every hospital network is different. And I think sometimes what we forget is the patient record systems. Like, I never used to consider it. But then I sat back and thought from the cybercriminal's perspective: it is the ultimate identity theft. It's the thing that you can keep selling constantly. And it's often the thing that's least protected, because we're focusing so much on medical devices, we've forgotten what a treasure trove it holds. And I mean, that's moving to the cloud. So, you know, Microsoft is one of the big proponents that's moving everything now to the cloud in terms of patient records and clinical systems.
So I'd be interested to see how the smaller hospitals are actually gonna be able to cope with having to do, you know, perimeterless defenses, you know, zero trust networks or even cloud solutions. Because I think the big hospitals like Mayo might be fine, but you have the smaller rural hospitals where I don't even know if they have IT or security on the premises. Or are they paying a third party to do the solution for them? I'd like to, sorry. Go ahead, Ash, you go first. Okay, I wanted to go back to the topic of responsibility. And I would have to echo Christian. I think it is a shared responsibility. Obviously, you know, I work for a medical device manufacturer. I'm a medical device software engineer. I think about these things. I want to make sure that I'm doing my part to build really secure and awesome, safe products for end users. I care a lot about that. But I think it's easy to blame manufacturers, and it's often more complicated. Again, it's a balancing act. So I work for a medical device manufacturer that's a consulting firm. So we don't have any of our own in-house products. We have clients who come to us, and some of them are business people and some of them are other medical device manufacturers that subcontract to us. But we have a lot of people who are just doctors and clinicians and other people who are really passionate about saving lives, and they come to us and they're like, we have this idea. We see this gap. We desperately want to help our patients. Please help us build this. We have this tiny, tiny budget. We only have this amount of time to do it in, because we got this little budget from these people, but they say we have to get it done by this time. And we're working really, really hard to try to meet all these deadlines and actually make this little device that, if we can give it to those doctors, it might save lives.
And then for us to come in and try to say, like, oh, well, we have to add all of these security things, and maybe it's gonna affect the budget, it's gonna affect the timeline. And again, I care about these things. I'm in there, I'm advocating for this stuff, but it's hard. If you only have this much budget, how do you choose? How do you tell them, okay, we just aren't gonna be able to implement this part of security? Where do you draw the line? I think it's more complicated and harder in practice, and it's hard to just say, well, the medical device manufacturer should just do it. How do you make that choice? I actually wanna add to that story. So I started working with Medtronic, and I'm a hard-ass. Everyone knows that I said a bunch of things three years ago when I came out, because I was unhappy with how our conversation went when we had a legal situation. And I spent the first six months of my project listening and getting to know the developers, the software engineers, the firmware engineers and the hardware engineers. And I sat back and listened and realized that these are real people who are wanting to make a real difference. Really, they are. And one of the specific engineers, Rob Mastow, had a big impact on my life, because he said to me, tell me what you want in your device. Tell me all the security you want, give the list to me and let's discuss this. And he turned around and said to me, well, so you're getting a new device every three years, right? And I said to him, well, no, there's no fucking way that I'm doing that to myself. And he's like, well, then you can't have all of that. He says there's trade-offs for everything. So specifically with embedded, implanted devices, the more security you add, the less lifetime you'll have on your battery. Because I can tell you, the one thing that is worth gold is how long your device lasts. Because it is a horrible thing to cut out.
And I don't know if you guys know, but the more you cut it out, the more scar tissue forms and the worse it gets, right? I've had my device since I was 19. I wouldn't be here without it. So to the people building these things: I would be dead. I had two weeks left to live. That was the serious situation. So I think sometimes we want all the security in the world, but we don't realize these trade-offs. These devices are there to save lives. And if we try and push security too much, there's a whole slate of devices that will never come to market, and they will never save lives. And that's one big lesson that I had to learn, and I had to have humble pie. I had to swallow my words and realize that there are people behind these devices wanting to make a difference with technology and science. And without that device, I wouldn't be here. I wouldn't have my kids. So technology does save lives, in a way. We must just find the perfect balance to balance everything out. Your perspective, V, and that of folks like you, like Marie Moe, who have these incredible security backgrounds but are also patients, is so incredibly valuable for considering these types of trade-offs in, like, a human way, right? In, like, a very real way. And one of the things that Christian and I are working on is how do we understand the role of the patient as a stakeholder in these types of conversations? And is there utility in having those types of conversations with this 60-year-old man who needs to get his pacemaker revised? Or is there a trade-off between the potential risk of a vulnerability being exploited and just putting him through that process of having it revised? And I think it highlights one of the things that Christian and I are so interested in, which is just developing a data and a science around this, to be able to say that, as in other types of medical therapies, we have very clear risks.
If I give you a blood transfusion, there's about a one in 1.5 to 2 million chance that you might have a serious infection as a result of that, or heart or lung issues. We really don't have the ability to have conversations with patients and say, this is the clear-cut risk, because thankfully we haven't seen many incidents of issues, and that doesn't mean that there's an absence of risk, but it makes it harder for us to have discussions with patients and say, really, this is the type of math that you have to weigh as a patient, with your own values and preferences and things like that. And we've done some work touching on this idea of a cybersecurity informed consent. We talk about the medical risks of surgeries and of medications that patients may be on. Should we start thinking about cybersecurity as a potential risk that should be included in this conversation a clinician has with a patient? And Jessica, I know the FDA is also really interested in this, and you've had a lot of really cool outreach types of events to be able to talk to people like V. And I think it's a challenge of moving beyond people who are security literate to people who may not have heard about these issues whatsoever, for as much as we talk about them, and to be able to have that enter the conversation so that they can, without having a background in cybersecurity, weigh that as part of their clinical process. Yeah, I mean, I think that there's a couple things that I would bring up on this. Like, one, you had mentioned that FDA has been doing a lot of this, and I think V was actually at the October 2019 meeting, where we had the Patient Engagement Advisory Committee meeting, the PEAC meeting. And so FDA is very much trying to make sure that the patient perspective is heard. And it's not just yearly meetings and things like that. As many of you know, Suzanne Schwartz is very approachable.
You can pretty much get her email and email her, and she'll find time to talk to you. So that is a really critical part of this. Not that all of you are going around reading government white papers, but a couple of years ago, the FDA put out this medical device safety action plan, and setting aside the flowery language that all government documents are, I think, decreed to use, essentially what it says is, look, it's exactly this issue of balancing the idea that patient safety is a twofold concern. There's the concern that we all typically think about in cybersecurity: is this device safe and effective enough to justify its use with the patient? There's also the flip side of that, though, of what Ash is talking about and what V was talking about: what happens if the patient can't get the device? That's gotta be a thing that gets considered too. And so, you know, when FDA talks about cybersecurity, we're usually very explicitly talking about, do you have security controls? Do you have sufficient whatever to make sure that the device is cybersecure? But a less talked about part of the work, which is equally important, is this idea that we recognize that sometimes devices need to get to patients because of the benefit that they're going to be able to provide. And maybe we don't have as much information. Maybe we don't have as much assurance of the things that we would like to. But, you know, this is the breakthrough device program, this is some of these other investigational devices that come up, where FDA has specific categories and authority for them, where you can do this. And it's exactly this recognition that the other folks on the panel are pointing out, which is that in some cases, not having access to a device can be as big of a risk as having access to a device that isn't secure enough.
And so, I think that is just an important point to highlight, that we as an agency keep in mind, but that this community in addition also keeps in mind as we have these kinds of conversations. All right, I'm gonna change the subject real quick, because I heard a rumor there are thousands of hackers watching this stream right now. There are literally thousands of hackers from across the globe watching, you know, very interested in what we're talking about. And I'm sure they're thinking to themselves, wow, there are a lot of issues here I might not have been aware of before, and they wanna get involved. So for the hackers out there wanting to get involved in this space: we've heard some examples, you know, there's been a group of hackers that have tried to defend hospitals against attackers during COVID. There are hackers printing protective equipment for the doctors and nurses on the front line, et cetera. But what's the next generation of this, right? Like, when the pandemic's done, how do hackers play a role in this space? Continuing research? Is it bug bounties? You know, should a hospital have a bug bounty? Should a device manufacturer have a bug bounty? How does HIPAA play into that, you know, the protection of patient information? How do we really bring hackers more into this space? Okay, I'm gonna jump in, yeah? I have a dream, and I have a dream with a friend of mine, Nina. I wanna establish a laboratory setup with devices, right? That people can actually do research on, that we can put on networks, that we can scan for vulnerabilities, that we can reverse engineer, that we can work with manufacturers to make better. I already see people wanting to break things, right? We wanna find flaws, we wanna find problems, but often we don't wanna find the solution to the problem, because that's the harder part. So I wanna say that if you find a device that's vulnerable, first remember, there are human lives at stake.
Okay, this is not an ego thing. Try and find the solution to the problem that you found. Write it up as you would any scientific paper. I know it's hard work. I know it's not glorious and it's not sexy, but when you hand that over to the manufacturer, they have all the data, and the more data they have, the faster they can verify the claim and start acting accordingly, right? It's not gonna be easy. Not all vendors are gonna wanna be friendly, but it is changing. But the biggest thing is, the more thorough your research is, the less dispute there can be and the closer we get to fixing it. But it's not all about just bug bounties, because I think that, for me, creates a negative thing of we're just finding the problems. We should be finding problems and then solving them. That's the hacker mindset. It's not just about pointing fingers and saying this is a problem. We should be working on the solution. We should be finding new ways of doing things. So I wanna see a lab full of medical devices that hackers can pull apart, and we can build solutions. But not only hackers, you need people like Ash, right? Because, I mean, we're good at the other side of it, but I can't build these devices. I wouldn't even know where to start. So I think it needs to be a multidisciplinary approach. And even people like the FDA should be involved in it from a policy perspective. And everyone should just be working together. We should be building the bridges towards having synergy and having a space where we can explore this. And I think the Biohacking Village is awesome for that, because they really do a lot of that. Sorry, everyone, excuse my cat, who just really, just really jumped into my lap. I was trying to prevent it and failed. He just scaled the side of the chair. A welcome addition to the panel. So I love the notion of being part of the solution. And I think, to your point, it does require multiple voices. You hit on a point that I think is really important, right?
You, as someone who does a lot of security research, are not totally in the space of how do I build a medical device. I think we similarly need to not have the expectation that the people building medical devices know how to solve some of these cybersecurity issues. So I think having the willingness to partner, and really helping them not make cybersecurity their number-two core capability but instead offering them solutions, is absolutely critical if we want things to actually stick. So should we have solution bounties then? Oh yeah, I love that. Oh, you should trademark that. Yeah. But I just wanted to bring up a scenario. So it's not just medical devices. You know, you could have a medical device bug bounty, for example; whether that's a good or a bad thing, we could talk for hours on that. But there you don't have the issue of patient health information. Hospitals can't invite hackers to come in and poke around the periphery of their networks, or give them that type of experience, because of a couple of things. One, their live networks are taking care of patients. We would hate for something bad to happen. But then two, there's so much protected health information floating around a hospital's network, and if a security researcher, or a hacker in good faith trying to do right, finds something, and it happens to be commingled with patient health information, then the hospital is required to report that as a HIPAA breach, right? So there are some, perhaps some would call them barriers, perhaps some would call them safeguards. But what I think it's done is it's made the delivery of healthcare itself particularly difficult to engage with from the hacker perspective. And maybe we don't have to figure out that solution. Maybe that's one of the solution bounties we should offer on how hackers can get more involved: give us your ideas.
But particularly at hospitals, where, as we've talked about, they don't have a lot of in-house security expertise, if any, they're still asked to do all of that work. How can the community come together and help them out? Because that could be a lot of good right there. I mean, one very easy answer to that is just working in healthcare, right? So I would encourage anybody who's thinking about a career change: if you're looking for an area where you can make an incredible impact and have a lot of really challenging problems to solve, consider working for a hospital or a healthcare delivery organization, or being adjacent to that in some way. We know that some of these hospitals we talked about in certain areas are critically underserved and may not even have a full-time IT security professional. So one way to get involved is just to join that effort as an actual employee of a healthcare organization. So I've got some specific suggestions, I think, and it's funny that you bring up HIPAA and other things like that. Because I think one of the best ways that folks like these, hackers, whatever, could get involved: there are actually specific programs being offered now within the federal government that put you in policymakers' offices and let you be the cybersecurity expert in the room for those people. So there's the TechCongress fellowship. Some of you may have heard of it, some of you may have met some of the fellows. Chris Soghoian, who's probably a well-known name for some of you all, was a TechCongress fellow. He now works permanently for a senator and is, like, that senator's tech person. But there've been dozens of others. They've got great folks, it's a great program, and it places you with either a House office or a Senate office. And so that's one of the most direct ways that you can actually influence policymaking in this sense: actually go be in the room when the policy is being made. So I would highly recommend that.
I don't know when the application process opens for next year's class, but Travis Moore is your man, so go find Travis. The other thing I would mention is I Am The Cavalry. When I Am The Cavalry was first started, it was kind of this weird, kooky thing for the Hill, for the United States Congress, who's very used to dealing with very buttoned-up, everyone's-in-a-suit-and-tie professional trade associations and things like that when they were trying to get information on whatever they needed to get information on. And I Am The Cavalry sort of just came up to the Hill and was just like, hi, we're a bunch of cybersecurity experts, we'd like to talk to you about this. But the thing is, they were one of the most effective advocacy groups in national policymaking, because, you know, everyone could tell, in a city where sometimes people's motivations are a little bit shrouded, that they just wanted to be there. They were there in good faith, they really just wanted to do the right thing, and they were going to answer whatever questions needed answering, without bias. And so I know SEA works a ton with I Am The Cavalry. I know the Hill works a ton with I Am The Cavalry. And so if you want to get involved and have an opportunity to meet with, you know, the movers and shakers of cybersecurity policy in the country, I Am The Cavalry is another really great place to do that. I just would like to perhaps add, from the other perspective: if you want to get a feel for working in a hospital, or get your hands on some medical equipment, or just, you know, have a conversation with the manufacturers. Three years ago, I spoke at the Biohacking Village. This was my first DEF CON. This was my first time ever meeting another group of hackers like myself. And that's how I was set on the path to do research. Right, that was how I got access to devices.
I no longer had to smuggle them through customs at an airport because I'd bought them on eBay, you know, because you can't buy them in South Africa, right? So I would buy them in the UK, I'd go visit a friend, and I'd put them in my luggage and smuggle them back home so that I could work on them, right? Or even, you know, when they explanted my first device, I said, hey, give me, give me, give me. I want to have that. My doctor's like, why? I'm like, sentimental reasons, of course. But that's how I got my hands on them. But here I walked into a place in Vegas that had devices, that had a lab, that had everything that, you know, makes a young hacker's heart do jumps. So if that tickles your fancy, then that's somewhere that you should go visit, right? You should go try your hand at it. You should go play with those devices, speak to the manufacturers, and go see how they implement it. And I mean, even being a patient doesn't preclude you from giving advice at a hospital. I mean, I've given advice from my ICU bed. There's a reason I'm not allowed a laptop in there anymore, right? Except that the doctor wants me to rest more. Apparently that's the reason. But I mean, give advice, reach out, get involved. And if someone doesn't want to listen to you the first time, keep on trying, because it's a patience game, right? It's not gonna happen overnight. I've learned, I've dealt with many parties, and it's not gonna happen fast, because it's a big industry. But just get involved, get your hands dirty. Because that's how we change the world. Every new researcher that comes into the fold is a new mind that thinks differently. And that's, I think, how we change things, one step at a time. I love that, because I think the notion that it takes a specific individual, or an elite person, or someone who's an expert to make a change is absolutely a misconception. So the notion is just get involved; there's no inner circle here, right?
Like I think every person in this community is more than willing to share their networks, share their expertise, and share whatever corner of this problem they have. It's a really hard problem, and we need all the talent we can muster up right now to really make a change. And I think if we look at the diversity of roles, functions, experience, thought process, background, education, everything, any underrepresentation in that population is indicative of missing a potential attack vector. And we really have to account for that and try to bring in as many folks as we can. Yeah, I think this is a call to arms, right? This is how many hackers can we round up in one go? Well, not just hackers, right? You know, everyone from physicians, from patients, from coders and policymakers. I think if we start breaking down the silos which we've been functioning in and independently working in, and start this collective movement, that's how we change the world. It's not a one-person thing. It's never gonna be one person that comes up with the magic solution or the silver bullet, right? It takes more than one village, as I always say, to solve the problem. And I think it's time, in 2020, that we as a society pull together instead of apart. Because we've seen what a virus can do, right? It's locked us in our houses. So I think it's time we take our power back and we start moving the world into a positive future. Because I can tell you, from a patient perspective, not being able to go into a cardiologist's office because we're afraid of COVID, because currently it's more dangerous going into a hospital: a denial of patient care is unacceptable, non-negotiable, and I'm not gonna stand for it. And I think we should be kicking down the doors and saying, you know, now's the time. Not yesterday, now. I love it. Are there any better sentiments to go out on than that?
I think it's easy to be cynical about all this, and it's easy to say we're having the same conversations year after year and the problems are still there. But, I mean, you know, in 2015, the Biohacking Village didn't exist, and the idea of people from device manufacturers bringing stuff for hackers to poke and prod at would have been laughed out of the room. So for people who don't believe that we can do awesome things when we work together and hang out at places like DEF CON, I mean, we have evidence to the contrary. So I think that's about an hour. And I'm looking forward to seeing you guys in a little bit for a Q&A. Take care, everyone. Thank you, DEF CON, for having us. A shout out to Nikita especially for taking another chance on us. We're gonna go ahead and open it up to Q&A here in five, four, three, two, one.