Please give a warm welcome to Mike Jerbic.

Well, thank you. Along with Lydia, I'm delighted to be here, and actually all the more delighted because I have a number of students and colleagues with me who have collaborated on a project. It's extremely exciting, and really humbling, to have the kind of people here that I'm truly blessed to work with. So if you see students from San Jose State by their name badges, please help them feel welcome and part of this community; I'm sure you already have. I will announce that Dan O'Neill and Kevin Marai, who are here, just completed their Open FAIR certifications within the last week, as a result of the FAIR seminar we held just last month. And we have Isha Nijain, Elizabeth Torres, and Horst Keller here with us, and Jeanette Carvajal, who are also in training to complete their certifications, hopefully in the next couple of weeks. So please recognize those folks when you get a chance.

We started one year ago, at the Open Group meeting last winter, with the idea of integrating academia and universities to help this organization produce better products, and produce them more quickly. Today I want to talk about the first fruits of that collaboration. You'll see more about it at our booth outside, maybe during the breaks or at lunch. A year ago we started by asking: how can we get more work done in less time? Because what we keep finding, of course, is that members are really busy, and Open Group activities are not always counted as part of your day job.
So we said: if we can make members more productive, put them to their highest-valued use, we can keep them as the core idea contributors, and let other people do some of what might be called the grunt work. Having worked with students, I can tell you students are really good at that, and we should use them for it as much as possible. And it's not only that we should use them; they benefit from it. They grow their credentials, they get experience that employers want, and they can show what they do in front of an audience like you. So we had the idea to specialize: let members do what they're best at, let students do what they're best at, and when we combine the two, we ought to be able to do more with less.

That's what we've been up to, and this is the first product of that collaboration. Only one year later, we are close to having a complete draft of the Open FAIR Process Guide ready for internal review: a guide we can use across the Open Group and elsewhere to help first-time practitioners use FAIR and understand how to use it. As a member organization, we saw that as one of the core impediments to adoption, so we're out fixing it. We have students like John Lindford helping us, and people like Ava Kuiper from HP contributing important ideas as a leader, guide, or innovator, while offloading some of the implementation to a writer as good as John.

Then we found that we really needed an example for this process guide, which drove us to actually do an analysis using the ideas in that guide. That's what the regional health authority risk analysis is. It was born in the Healthcare Forum; the basic problem statement came from the Healthcare Forum. And on that project we have Sushmita Kasturi as a student researcher.
She's just finished her degree and has graduated. Strangeland, PhD, a managing solution architect at Capgemini, has provided an awful lot of pointers into the country-specific research and has helped translate from Norwegian to English, which has been really helpful at times. And Stig Haagestand has contributed; you may know Stig from the Healthcare Forum. He contributed the main problem statement. So that's where we start.

Let's look at that regional health authority project, starting at a high level. A couple of things are interesting to me here. Think of Norway; most people probably haven't thought a lot about it, but we've gotten into it a little. It has about four and a half million people, a little over 1% of the population of the United States, so if you're US-based, you're thinking this is kind of small. On the other side, you'll notice that GDP per capita is larger than in the United States by a fair amount, due to the oil reserves and oil income that Norway commands.

There's a problem in Norway within the regional health authority system, which is essentially the government's socialized medicine: the prevalence of end-stage renal disease is growing at about 5% per year. About 1,240 people are on dialysis there, and of those, about 16% take their treatment at home. Home-based dialysis is an innovative treatment that costs less and, though the research is maybe a little ambiguous, may offer far better healthcare outcomes to the patients on it. Once diagnosed, a new patient represents a net present value cost to the Norwegian healthcare system of anywhere between $150,000 and about $315,000. In other words, it's kind of expensive.
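The scale of these per-patient figures can be sanity-checked with a simple present-value calculation. This is a hedged sketch only: the $30,000 to $50,000 annual home-treatment savings range comes from the talk, but the 4% discount rate and 5-year horizon below are assumptions made purely for illustration, not figures from the analysis.

```python
# Hedged sketch: how an annual per-patient dollar stream discounts to a
# net-present-value figure. The $30k-$50k annual savings range is from
# the talk; the 4% rate and 5-year horizon are illustrative assumptions.

def present_value(annual_amount: float, rate: float = 0.04, years: int = 5) -> float:
    """Discounted sum of a constant annual amount over `years` years."""
    return sum(annual_amount / (1 + rate) ** t for t in range(1, years + 1))

for saving in (30_000, 50_000):
    print(f"${saving:,}/year for 5 years at 4%: ${present_value(saving):,.0f}")
```

Under these assumed parameters, $30,000 a year discounts to roughly $134,000 and $50,000 a year to roughly $223,000. The point is only that multi-year annual flows of the size quoted here and net-present-value figures in the $150,000 to $315,000 range are consistent in scale.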
So home treatments that might save between $30,000 and $50,000 per year per patient are something the regional health authority is interested in. If possible, they would like to expand the penetration of home dialysis and encourage patients to accept it.

We took as our problem statement the following. The home dialysis machine collects blood data, session data, medical data, and other information, and that data needs to move between the patient's home and the hospital. But the security and privacy policy of the health authority, and the compliance people in that health authority, prevent by policy the connection of those home dialysis machines to the hospital, so patients have to physically drive USB thumb drives of the data back and forth. That's our problem statement. And the question is: why are they doing this? It's a policy question; it's a policy problem.

So if I can ask Sushmita and John to come up, we're going to give you an example of the dialogue that might take place between our enterprising young architect, Sushmita, who observes this problem, and our security and compliance person within the security organization, as they discuss how to decide whether to open up this connection, given how we see things today.

"Hey, I was wondering: why can't we read and update patient data and treatment plans online? There are so many benefits. With online updating, the patient's quality of life improves, the security of home dialysis improves, and the productivity of doctors and medical service providers increases. And I also feel that, with all of this, the population of people willing to accept home dialysis is going to rise."
"Great benefits, but the network transport just isn't secure."

"It isn't secure? What about a memory stick, then? Or we already have a VPN solution in place; we could just use that."

"No, we can't use the VPN solution. We can't do that."

"Okay, why not? We could just put encryption on both sides and secure it that way."

"No, we don't have a policy in place, and we also don't have control over the patient's computer."

"Oh, come on. Then let's just give the patients a secure computer and have control over that."

"No, we don't have a policy for that either."

Thank you. Don't worry, they'll come back.

In other words, there you have it distilled: the fear of risk and of privacy loss prevents even the discussion of whether these home dialysis machines will be allowed to connect. And the perceived costs of keeping this policy are very high. Those costs include the benefits to the patient that we don't allow: patients could reduce their travel time and increase the hours in the week they can spend doing what they want instead of sitting in a car. Hospitals could be more efficient in how they allocate physician time, and they're not doing that today. So these policies have a hidden cost: the cost of foregone opportunity.

When pressed (and this is the dialogue that came from Stig Haagestand, from the Healthcare Forum), these fears centered on the risks to the hospitals associated with malware, more specifically the risks associated with ransomware, and on fears over patient privacy. These fears are preventing us from capturing those foregone opportunities, those foregone benefits. So naturally, because we're economists, how would we look at this? The right way to look at it is this.
Let's estimate those foregone opportunity costs and see what's really on the table. Then let's estimate the real risks associated with privacy, malware, and ransomware, and see what those estimated costs are, so that we can do a cost-benefit analysis: do the social costs exceed the benefits or not? That's not saying the policy is good or bad. It's simply asking how the world is. Do the costs exceed the benefits? That's a positive analysis, not one subject to judgment. And along the way, maybe we'll ask some interesting questions that cause us to think critically about whether we're even asking the right questions at all. So let's challenge some of the basic foundational rationales and see where things wind up from there.

All right, let's do it. What are the foregone benefits? What is Norway giving up by adhering to this policy? Well, as envisioned by the enterprise architect, there's the patients' travel and wait time. If you're on dialysis, you're getting treatment about three times a week on average. It is not fun. Keep your kidneys, because they're actually really valuable to you, and it's a pain to do anything else. Two to three times a week, four to eight hours are lost in transit, moving these USB thumb drives around. There are transportation costs too: Norway reimburses patients for their travel costs as part of its healthcare policy, so the government has internalized that cost of delivering healthcare.

And you can see the bottom line: total travel cost avoided is about $50,000 USD per patient per year, given Norway's geography, average travel times, average distances, and so on. Across about 203 patients, that would be about $10 million a year of combined lost leisure and material travel costs. That seems like an extraordinarily high amount of foregone opportunity.
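The bottom line above can be reproduced with simple arithmetic. A minimal sketch, using only the figures quoted in the talk (the growth projection just applies the talk's roughly 5% prevalence trend to the total):

```python
# Back-of-the-envelope check of the foregone opportunity cost, using
# only figures quoted in the talk.

candidate_patients = 203            # patients the talk prices out
avoided_cost_per_patient = 50_000   # USD/patient/year (travel + lost leisure)
growth_rate = 0.05                  # ~5% per year prevalence growth

total_foregone = candidate_patients * avoided_cost_per_patient
print(f"Foregone benefit today: ${total_foregone:,} per year")  # $10,150,000

# At ~5% annual growth, the foregone amount compounds as well.
for year in range(1, 4):
    projected = total_foregone * (1 + growth_rate) ** year
    print(f"  in {year} year(s): ~${projected:,.0f} per year")
```

This is the roughly $10 million per year figure, and it grows with the patient population, which is the rising trend mentioned next.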
And of course this is not on a falling trend; it's rising at about 5% per year. That looks pretty scary. So what are we getting from this policy? What are the risks avoided, and how do we quantify them? Well, if you talk to me, or to any of those students I highlighted, and you mention the word risk, you're going to get something really specific. This is what FAIR was designed to do, and this is what we analyzed and are going to put into the process guide; we're just wrapping it up.

There are going to be two analyses, because there are two stakeholders: the patient on the privacy side, and the hospital on the malware and ransomware side. We'll take a quick look at each.

On the privacy side, what is the asset? The asset is control over information about you, the data subject. We could identify the asset, but we had real difficulty identifying who wants it. Who would the threat agent be? Part of what drives that is that, unlike the United States, Norway's single-payer system takes away a lot of the economic incentive for getting at private health information; in the US experience, at least, that information is used to help commit medical fraud. So, absent a clear threat agent, and absent a clear loss scenario, we have to conclude that the losses associated with privacy are pretty speculative. And absent a good economic model for a rational threat agent to want to go get this data from a single-purpose home dialysis machine, the risk associated with privacy should be very, very low.

How about the risks associated with malware and ransomware? Let's look at that. The asset, for malware and especially ransomware, is the confidentiality, integrity, and availability of information, and vital information at that.
We assumed, and we could challenge this, that those threat agents were most likely financially motivated. Especially in the case of the ransomware agent, it's hard to find a reason to commit ransomware unless it's for financial gain. So, the loss scenario: a financially motivated threat agent could try to attack and penetrate a home dialysis machine connected online to the hospital, and penetrate the hospital that way. It's possible that could happen. But there are only a couple of hundred of these machines, and they're special-purpose machines. And it comes down, again, to a lack of data: because these things are not connected, there's no data out there to look at, so we're estimating what the data would be.

We concluded this: we can't answer the question unless we make assumptions about the systems architecture between that home dialysis system and the hospital. Because we envision that common off-the-shelf technology, if so chosen, could drive that risk to any arbitrarily small level we wanted. And so we said, well, maybe we're asking the wrong question. Maybe this should be posed as an engineering challenge: for the benefits received, can you build a system that reduces the risk to an acceptable level, so that we can get the social benefits of connecting these machines online to hospitals? And we think that...
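The kind of estimate described here can be sketched in a few lines. The following is an illustrative FAIR-style Monte Carlo calculation, not the project's actual analysis: the loss event frequency and loss magnitude ranges are placeholder assumptions, chosen only to show the mechanics of turning "asset, threat agent, loss scenario" into an annualized loss exposure number.

```python
import random

# Illustrative FAIR-style Monte Carlo for the ransomware scenario.
# ALL ranges below are placeholder assumptions, NOT project figures.

random.seed(42)  # reproducible runs

def annualized_loss_exposure(trials: int = 100_000) -> float:
    total = 0.0
    for _ in range(trials):
        # Loss Event Frequency: successful loss events per year (assumed range).
        lef = random.uniform(0.01, 0.10)
        # Loss Magnitude: cost per event, triangular(min, max, mode) (assumed).
        lm = random.triangular(100_000, 5_000_000, 500_000)
        total += lef * lm
    return total / trials  # mean annual loss across trials

print(f"Estimated annualized loss exposure: ${annualized_loss_exposure():,.0f}")
```

Comparing a number like this against the roughly $10 million per year in foregone benefits the talk estimates is the cost-benefit comparison the speaker describes. And if off-the-shelf controls shrink the assumed frequency range, the exposure shrinks with it, which is exactly the speaker's point about reframing this as an engineering challenge.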