OK, so let's continue where we left off. We saw some of the things that are common to accidents — what happens, and to some extent the frequency, although we didn't really speak about the frequency. So the question is, what do we do about this? This was the end of the talk an hour or two ago: this is what we learned. Now the question is, what do we do about it? As I mentioned before, when there is a big accident, people get motivated to do something about it, or at least to talk about doing something about it. So not too long after that series of articles in the New York Times, a number of organizations — AAPM and ASTRO among them — published several major articles saying, we need to do something about this. This is an article that appeared in, I think, Practical Radiation Oncology. The authors are Dr. Hendee, whom many of you may know of, and Michael Herman, who at the time was president of AAPM, or past president. Another is an article that summarizes the consensus of a number of major organizations — you can see the list there: AAPM, ASTRO, the ACR, and others in radiation therapy and radiology. And each of those articles basically comes up with recommendations: all these things happen, so let's do A, B, C, and D. So there were plenty of recommendations about what to do. This is a list of the seven major articles that appeared over a period of two or three years, and you have the references at the end of the paper. If you look at each one of those reports, each makes a number of recommendations — the one from ASTRO, for instance, gives six major pieces of advice. If you sum all those recommendations, there are about 117 of them. What would you do with that? I mean, that's crazy. How can you make sense of this?
So Peter Dunscombe, who is a medical physicist in Canada and is involved with a group I had the fortune to work a little bit with, TreatSafely — you may have heard about that group — published a little article that tries to sort through those recommendations. Among the seven reports — remember, there were seven of them — he looked at the top recommendations and listed how many of the seven recommended something be done about each particular aspect of safety. Everybody recommended that you need to improve education and training — all seven of them agree: you need more training, more education. Then, going down the list, six of the seven recommended improving the staffing and the mix of skills of the staff, and standard operating procedures and documentation. So I'm going to take just a few of these, the major ones, and I'm not going to go into the particular details about education and training. Basically, that's what we are doing right now, and most of you will have had some education and training. But education and training is not something that happens once and you forget about it; it is a constant, continuous process. Continuing medical education, or continuing professional development — you have all these terms — is something that has to be part of your career. This is a given: you don't stop learning. What I want to spend a little bit of time on is this concept of multi-layered prevention, plus risk assessment methods and learning and reporting systems — and you will see in a moment why these are important for this aspect of safety. I will spend only a few minutes on what's called root cause analysis, or RCA, and on the issue of developing a safety culture, the culture of safety. Now, the IAEA has done a great job of publishing material — not only reports but also PowerPoint presentations — and I'm going to be using some of their slides in the next 5 or 10 minutes.
And I encourage you to go to that site, download the material, and go through it. You can use it freely if you want to teach; you can share it with your colleagues, therapists, radiation oncologists, and so on. You should use it. This is just one part of a series of documents and presentations, on reporting, investigating, and preventing accidental exposures. Now, preventing accidents, or accidental exposures, has a lot to do with communication. Because as you probably saw with some of the things I mentioned before, and things you may have had experience with, therapy is a team process. And when there is a team, there is more than one person, and you need to be able to communicate from one person to the other: what is this person doing? What should I do in relation to that? And how do I pass that information to somebody else in a way that makes sure there is no misunderstanding? So there are rules for communicating — that's one of the things I'm not going to go into in detail, but a lot is being published about this. Then there is the use of checklists. A checklist sounds like a very low-tech thing — you have all this sophisticated equipment, and what's a checklist? It's basically a list of things: make sure that you did this, make sure that you did that; check, check, check, I did that. Now, if you ever take an airplane, there are dozens of checklist items that have to be completed before the pilot will take off. And the airline industry is one of the safest industries around, believe it or not — we always hear about airline accidents and so on, but if you look statistically, it's just a very, very minute fraction. So this is just one example from this presentation: an accelerator is being maintained, the engineer is doing some adjustments, and then he turns it over to the clinic.
How many of you in your facility have a rule that after maintenance of a linear accelerator, a physicist needs to check it? That's pretty good — it's almost 60% or 70% of you. Some of you may not have such a rule, or it may not apply because, for whatever reason, you are not involved in a clinic; that's a different situation. So the documentation of processes, and the checklists that go with those processes, is a very important part of assuring that things happen the way they are supposed to. Now, what is multi-layered prevention, basically? It's what's called the concept of defense in depth. A good illustration — probably not a good example — is a jail. When they put somebody in jail, they don't just lock the door. They also make sure that you cannot break through the windows. They have guards around. And then they have barbed wire outside, and maybe another wall of barbed wire, and so on. This is a typical example of defense in depth — I'm not advocating it, but it illustrates the idea. They want to prevent the person from getting out, so they have multiple layers, multiple obstacles against that happening. What is "that" in this case? The person getting out. Well, in radiation therapy we also have things happening, and this is independent of what the process is. Anything that you do will have a sequence of possible events, all kinds of things happening. And in radiation therapy, everything we do could have multiple outcomes. What you want to do is assure that you only get the outcome that you intend. When you turn on the linear accelerator, what could happen? Radiation is produced, correct? What could go wrong? Well, you may not have the patient in the room. Or you may have the wrong person in the room. Or you may be in the room yourself. Or your colleague who was supposed to come out is still in the room. There is a multitude of things that could happen, correct?
Turning on the accelerator is just one event, and the outcomes can be many. So we want to assure that we get the outcome we intend: to treat that one patient, in the right place, at the right time, et cetera. The first step is to put in all these layers to make sure that only what we want is going to happen. So this is an illustration. Another component of this is the issue of being aware of what is happening. You have, conceptually, a line of defense: you understand what the process should be, and there are barriers and layers. This is just an example from that presentation. You have multiple events that happen in the course of radiation therapy, and each of them can produce an outcome. What we want to do is put in layers of safety so that we either stop something we don't want from happening, or we ensure that only the things we want happen. So let's take one example. Somebody does an SSD calculation, applies an SSD correction. If there is no check or balance, a number of things can happen. What can happen when you calculate numbers? No idea? Well, you can make an arithmetic mistake. You can just use the wrong formula. A number of things can happen. So we start putting in barriers. One of the barriers is an independent check calculation: somebody makes one calculation, somebody else makes it — hopefully independently, otherwise it's not a check. But even that can fail, because we may still not catch a problem. So we have somebody else check whether the numbers are reasonable: we always treat with a 480-centigray prescription, we always have numbers which are such and such — so somebody's checking them again. That is another barrier. Then we may have in vivo dosimetry, just to check on the patient — another barrier. So hopefully we continue adding these multiple barriers, and we have a written procedure.
And the final barrier is the conceptual one: are we aware of whether this is a reasonable result or not? If we come in and treat a patient at a shorter SSD, what do we expect the numbers to be? We expect them to be smaller — a shorter time, because we need less time to treat closer to the source. So this is a conceptual check: does this make sense? If all of a sudden we see the numbers going in the wrong direction, we should say stop. OK, so some of you may have heard about the Swiss cheese model. Basically, each layer has holes, and an error will filter through all the layers only if the holes happen to be aligned. This is what we are trying to do with multi-layered prevention. What I would suggest is that you go to the site, take any case that you can think of, and see how you, in your department, in your group, are implementing this multi-layer defense system. Now, suppose we want to improve our system — say we find that we have only one barrier in place and we want to put a new barrier in place. How do we figure out where? What's the process? Well, one very useful tool is what's called drawing a process map. A process map can have different forms; it can be on a piece of paper. What is important about the process map is to get all the people involved in the process to sit together at the table. And it doesn't have to take three months — it can be one hour. Draw a diagram right on a piece of paper, without fancy graphics, and get yourselves to all agree on what happens from when we bring in the patient until the patient is treated — just all the steps. Because once you get all the steps down, first of all you accomplish one thing: everybody is agreeing on a process, or at least apparently agreeing on the process. And now you can say, well, if I didn't catch my mistake here, maybe I'll catch it there — and you can put another barrier at that point to do a check. So you build your barriers according to your process.
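The multi-layer idea can be put in rough numbers with a tiny sketch. The per-barrier miss rates below are invented purely for illustration, and real barriers are rarely fully independent — but the sketch shows why stacking even imperfect checks drives the slip-through rate down:

```python
# Toy illustration of the Swiss cheese / defense-in-depth idea: if each
# barrier independently misses an error with some probability, the chance
# an error slips through ALL of them is the product of those probabilities.

def slip_through_probability(miss_probs):
    """Probability that an error passes every barrier, assuming the
    barriers fail independently (a simplification of reality)."""
    p = 1.0
    for miss in miss_probs:
        p *= miss
    return p

# Hypothetical barriers for the SSD example: independent recalculation,
# reasonableness review, in vivo dosimetry — each missing 5%, 20%, 10%.
barriers = [0.05, 0.20, 0.10]

print(round(slip_through_probability(barriers), 6))  # 0.001 -> 1 in 1,000
```

The point of the exercise is not the exact numbers but the shape of the argument: no single check needs to be perfect for the combined defense to be very good, as long as the holes in the layers don't line up.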
Process maps come in different varieties. This is another example of a process map, for IMRT — I don't expect you to even read it, but you can see that it can be very complex. It can also be very simple, and you can focus on just a particular part of it, like, for instance, preparing the treatment plan, which I have enlarged here. You have all the steps that go into treatment planning, and you can decide: I will put my checks and balances along these points of the process — that's where I can catch a mistake. So process maps are useful because they give you the opportunity to work with your team and come to an agreement. And if there is no agreement, that's the first indication that you need to improve, because if somebody understands one thing and somebody else understands something else, that is an opening for a mistake. Now, I want to talk for a few minutes about what safety means, because we are talking about safety all the time. What is safety in our environment, in radiation therapy? Safety has many meanings — if you Google "safety" you will probably find hundreds and hundreds of definitions. But safety in radiation therapy is the absence of an unacceptable risk of something bad happening. And what is "something bad happening" in radiation therapy? Most of the time we focus on overdoses — accidents, terrible ones, 4,000 times the dose and so on. But harm also happens in radiation therapy when we don't control the tumor, correct? That's also harm to the patient. So suboptimal tumor control is as important to the safety of a patient as not overdosing him. In parallel to that, I want to look at the definition of quality in radiotherapy. Again, you can find different definitions of quality, but a practical definition in radiotherapy is the degree to which the radiation therapy delivered is consistent with professional knowledge.
So if, like Colin was saying, the ABS has guidelines saying this much should be your prescription in such a situation, that's what we want to do — you don't want somebody who is way out on a limb. That's the prescription part. The other part is that we deliver it within a certain tolerance. If you look at the clinical protocols, the QA protocols and so on, they say your field size should be precise to plus or minus 1 millimeter, or plus or minus 2 millimeters; your gantry rotation, you know, half a millimeter for radiosurgery, et cetera. This is what the consensus in the profession says. But let's ask a question: is safety an issue in radiotherapy? Is radiotherapy unsafe? What would you say? I mean, we heard about all these accidents, didn't we? Well, there are some studies, and none of them agree 100%, but they agree within at least an order of magnitude, which is pretty good for something that we cannot quite define. From a series of incidents in New York State — remember, there were 621 over a period of eight years or so — they calculated that the frequency of serious incidents was a little over a hundredth of a percent. Varian did their own calculation and came up with about two-thousandths of a percent. In the UK, they came up with about three-thousandths of a percent. I am pretty surprised how well these agree, because we are not talking about something you can actually measure with a ruler. So let's take about a hundredth of a percent. How does that compare with what we call safe industries? If you are taking an airplane — assuming you don't have a pilot who wants to commit suicide — it's extremely safe. The rate of serious incidents for air travel is much, much lower than for radiation therapy. So is radiation therapy unsafe? Well, the thing is that we have this situation.
We only hear about the tails. Let's say our benefit curve is like a Gaussian — we don't have a better assumption, so let's just assume it's a normal distribution. If we have a target dose, or a target treatment, in the middle here, we want all our patients to be here. But since it's a distribution, we hear about the accidents when we are way out here — the big overdoses. We don't hear so much about the big underdoses. And there is some place in between, in the shoulder of this distribution, where we can say our quality is not really what it should be, even though the patient may still get most of the benefit of the radiation treatment. So let's take some numbers — and these are really rough approximations. About three-quarters of a million people get radiation therapy in the US per year. If we take that hundredth-of-a-percent probability of a serious incident, that translates to about 75 serious accidents per year in the United States. Serious ones. How many do we actually hear about? Even in that major New York Times series, they came up with maybe, I don't know, 20. So most likely we don't hear about all of them, but this is of the correct order of magnitude. But if we just look at how many patients fall beyond two standard deviations in this Gaussian, that would be about 2.6% — about 20,000 people in the United States getting suboptimal treatment per year. 20,000, compared to the few dozens that we hear about. Well, this is a great opportunity — you agree? That's where we are going to learn from. We learn very little from the major accidents; we learn things of principle. The day-to-day, the routine, is where we learn the most. Now, when I started at Long Island Jewish — in '85, 30 years ago, and I know that Colin has done something very similar, more or less at the same time or before I did — I started with a paper form. I didn't have computers at the time.
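The back-of-the-envelope numbers above can be checked in a few lines. Note that the one-sided tail of a normal distribution beyond two standard deviations is about 2.3%, so the 2.6% and "about 20,000" quoted in the talk are the right order of magnitude rather than exact figures:

```python
import math

patients_per_year = 750_000   # ~three-quarters of a million US patients/year
serious_rate = 0.0001         # "about a hundredth of a percent"

print(patients_per_year * serious_rate)  # 75.0 serious incidents/year

# Fraction of a normal distribution beyond 2 standard deviations (one side),
# via the complementary error function:
tail = 0.5 * math.erfc(2 / math.sqrt(2))
print(round(tail, 4))                        # 0.0228, i.e. about 2.3%
print(round(patients_per_year * tail))       # ~17,000 suboptimal treatments
```

So for every serious incident that might make the news, this rough model predicts a couple of hundred treatments sitting quietly in the "suboptimal" shoulder of the distribution — which is the argument for mining routine events rather than only major accidents.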
They were just coming on board, and people definitely didn't have access to them. So we started with a form. I said: every time there is a variance, I want it reported and looked at. What did we put in that form? First of all, we had to define what a variance is. I used the term "variance" because I couldn't choose a better word — the word you use doesn't matter. Basically, it was the difference between what you expect — the norm, the way it should be — and what actually happened. To describe it another way: an event that departs from the normal, from the routine, from what we expect daily, okay? So we called it a variance report. And what did we report? This is the information we collected: on which machine it happened; the date; a rough categorization of the details — was it something related to the blocks, to the monitor unit calculations, something else; who reported it, because sometimes it was reported by a therapist, sometimes by somebody else; and then a little blurb, to get started on describing what happened. And what did we do with it? Well, at first we brought anything that happened to the attention of the attending physician. Within a couple of weeks, the attending physicians said: I don't want to be bothered with all this nonsense; these are minor things, don't come to me with them. So if there was something serious, we would bring it to the chief therapy supervisor and the physician. But we also had all these forms collected. One of our chief therapists was in charge of keeping track of all these things, and he was accumulating a pile of paper, so we had to do something with it. So what did we do with that information?
We tried to analyze the specifics of each variance, each occurrence, with three goals. First: what is the effect on the patient — is this going to affect the patient or not? A lot of those variances didn't end up affecting the patient at all, because we caught them before anything happened. Second, we wanted to learn a lesson from what happened and see if we needed to make a change in the way we do things. And third, because we were in a reporting state — remember, I mentioned that in New York we had to report things — we wanted to see which category the event fell into and whether there was a need to report it. And I somewhat arbitrarily said, let me do this along two axes. One is the effect of what happened, and I chose three levels: the error was prevented, was corrected, or was uncorrectable — because we had already delivered the wrong dose, or something like that, and couldn't take it back. The other is how we report it: do we report it in our weekly or monthly QA report, do we have to record it for legal reasons within the hospital, or do we need to report it as a misadministration to the state? That was basically the choice. But when we evaluated the significance of an error, my instruction to everybody in the department was: you need to look at each error as if your check were the only barrier that would prevent an accident. You cannot say, well, I made a mistake in calculating the monitor units, but it doesn't matter because somebody else will catch it in the second check. No — you have to assume that your step is the one and only step that will prevent an error or a mishap. That's the concept that we all need to understand. Just to give you an example: somebody goes into the treatment room. There are two therapists. One is setting up the patient.
The other one is bringing a treatment block to put on the head of the machine. He brings the block to the therapist who is setting up the patient, hands over the block, and the other therapist says: no, this is for the other field; I'm treating this field after that one — give me the other block, okay? What do you think — is that an error? Was that an incident or not? It's a near miss. It's a near miss, actually. Sure — the other therapist found that he had been handed the block for the wrong field. Nothing happened; the patient was treated perfectly well. But from the conceptual point of view, this was an incident. It was a near miss, because if the other therapist hadn't paid attention, the patient would have been treated with the wrong block. So what typically happens in reality when something like this occurs? Do people report it as a mistake? No. Okay, I gave you the wrong field — sure, no problem, here's the right one. Isn't that the attitude most of the time? But this is exactly where we have an opportunity to learn. Why did the therapist who handed over the block give the wrong block? There is something in our process. How did he choose the block? Was the block labeled? Did he say which field? Do we have color coding, or things like that? And another thing: do we have a fixed sequence — do we always treat going from gantry zero to gantry 180, or sometimes one way and sometimes the other? Does one therapist do it one way and another therapist another way? What I'm saying is that all of this requires a mindset — looking at these things, being alert to what could have gone wrong. And it takes a lot of time for people to get into that mindset, to get used to it, and to use it all the time, okay? So we talked about this. Once we had all this information collected, what did we do with it?
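As an aside, the two-axis classification described earlier — effect of the event versus reporting channel — can be sketched as a small data model. The field names below are my illustration, not the actual paper form:

```python
from dataclasses import dataclass
from enum import Enum

class Effect(Enum):
    PREVENTED = "error caught before anything happened"
    CORRECTED = "error occurred but was corrected"
    UNCORRECTABLE = "wrong dose delivered, cannot be taken back"

class Reporting(Enum):
    QA_REPORT = "internal weekly/monthly QA report"
    HOSPITAL_RECORD = "recorded within the hospital for legal reasons"
    STATE_MISADMINISTRATION = "reportable misadministration to the state"

@dataclass
class VarianceReport:
    machine: str
    date: str
    category: str      # e.g. blocks, monitor unit calculation, ...
    reported_by: str
    description: str
    effect: Effect
    reporting: Reporting

# The near-miss block example: caught in the room, nothing reached the patient.
event = VarianceReport(
    machine="Linac 1", date="1985-06-01", category="blocks",
    reported_by="therapist", description="wrong block handed over, caught",
    effect=Effect.PREVENTED, reporting=Reporting.QA_REPORT,
)
print(event.effect.name)  # PREVENTED
```

The design point is that effect and reporting channel are independent axes: a fully prevented near miss still gets a record, which is exactly what makes the 600-to-1 ratio of minor to critical events visible at all.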
We had a discussion: let's do A, B, C; let's change this process; let's do something like that. And then we reported it to our committee. Now, there are much newer systems, because obviously I could not distribute my list of events or incidents to a hospital in Manhattan — we were in Queens, and why would I bring my pieces of paper to them? So nobody would learn anything besides us, in our department. There was no sharing of information — and not because we wanted to keep it secret, although that is another reason sometimes: people can be very, very possessive. I don't want anybody to know that in my department things are not 100% perfect. A lot of places advertise: come to us, because we are 100% perfect. You have seen those advertisements? Sure. Okay. So there are much newer systems, which we call incident reporting systems. This is just an example from Johns Hopkins, where they started a web-based system. If you look at it, the components were already in that piece of paper, but this is much more sophisticated, much more useful. They started using it in 2007, and included near misses, as we say. And there were about 600 minor incidents for each critical clinical error — these are statistics that have been published. So if we want to learn anything, we need to learn from those, and not only from the major accidents. It's very important to know about the major accidents, but the frequent minor events are where the learning can and must happen. And this is not the only incident learning system available. There is ROSIS, which was started under the auspices of ESTRO, with the Nordic countries involved, I believe. They started it something like 10 or 12 years ago. It is much more sophisticated, and it collates reports from multiple institutions.
And that's where we can gain the advantage of being able to learn from others. Now, you can get into ROSIS: you register your department and then you can start reporting incidents. It's a voluntary system, obviously. What is useful is that you can look at the information anonymized, and look at incidents that happened in anybody's clinic. You can look at them by date, by type of incident. The only problem — I don't know if you can read this — is that the only description is whatever the reporting person wrote, and it is very uneven, as you can see here. Somebody writes — and I'm not talking about the English — "being mis-ray treatment by not correct adjusted target volume." Do you understand that? I'm not sure that I do; I can guess what they meant, but it's not clear. At the other end, here is "a patient was scheduled for IMRT treatment, 25 times 2 Gy" — a much better description. So it's uneven; the quality of the reports is not evenly distributed. But it's very good, because you can learn from somebody else's mistakes — if you have the time to read it all. Now, the IAEA also created a newer system, a little more complex and more sophisticated, which is the SAFRON system. I think I registered there while I was still at the hospital, so I can still access it, but I cannot report events anymore because I'm not associated with a clinic — in order to report, you have to have an affiliation with a clinic. And what you can do here is not only browse the different reports; they also have a very well-organized tool to categorize the different steps in the radiation therapy process. Remember the process map that you should draw for your own clinic? They did one that is very general, for all of radiotherapy. And not only can you go there, you can also get the incident reports.
You do it in two steps. First you register with NUCLEUS, which is an IAEA platform, and then you sign in. What you get is that a lot of the itemized factors are categorized very finely, very granularly. You can see what happens in treatment planning, what happens in treatment delivery, what happens with the prescriptions. And not only that — you can get a very fine breakdown, down to four or five levels, of each step of the radiation therapy process. This is very useful for two reasons. Even if you don't report anything, it can be very useful for focusing: where does your report belong? Where does anything that happens in your clinic belong in this process? So that's very useful. And you can, obviously, report particular incidents. When you register, you need to enter some facility information — this has to be done, I believe, only once. When you register your facility, you put in all the equipment, how many physicists you have, how many radiation oncologists, and which processes you have in place. And this is one of the things I think is important: even if you decide you don't want to bother with any of this, take that list, register your name, and go through it. Before you register your facility, look at the items: what safety barriers do you have in place at your clinic? Just do that exercise. Sit down with your physicians, with your therapists, and go through that list. Do you have all these things in place? Just as a reminder. Okay, so we talked about this. Now, ASTRO and the AAPM recently — just last year — developed another system of patient incident reporting, which is called an incident learning system. This is not just reporting anymore; it is learning from those reports. And that's relatively new.
I've not submitted anything to it myself, but this is something that, I think, any clinic can report to anonymously — because one of the biggest issues in the United States, and why people didn't want to report, is that they don't want to get sued. They don't want somebody to find out that something happened in their clinic. So this is totally anonymous: once you report, only you can access your own data, and people can see things only in the aggregate. So basically, the summary is: what do we need to report and track? The frequent events — that's important, otherwise we are missing the greatest opportunity; one in 600 is not enough, okay? Random events and actual errors — you want to report those. And potential errors, which are near misses — almost, almost happened. I think "near miss" is the wrong term, by the way. What does a near miss mean? If an airplane is about to hit another one and doesn't, what is it — a near miss, or a near hit? It's probably a near hit: it almost hit, okay? So I don't know why people use the term "near miss"; it sounds reversed to me. And the corrective measures — because if you are just collecting the information and doing nothing about it, you are wasting everybody's time, and it's demoralizing. If people report, report, report, and nothing happens, it's totally demoralizing — you are creating exactly the opposite of what you think you will achieve. So the main thing is to act, and the system doesn't have to be perfect in order to be useful. You have to have a system where people can report. Now, reporting, as many of you know, depends on the culture of the site, okay? How many of you feel that reporting something wrong is going to affect you negatively? In your clinic? One, two... the rest are shy, I think. I think the rest are shy, okay?
At the end of the day, the first reaction — and that was the culture for many, many years, until recently, when it started to change gradually — was: something went wrong, let's find who did it, let's punish him, and go on with life, okay? That was the most common way to react. Well, we need to reverse that completely. Unless we have a culture of safety — which means that everybody is not only comfortable reporting, but knows that reporting is a welcome thing, a positive thing — this is not going to change, okay? So: the reporting system has to have some clear guidelines, and the people who look at the reports need to have the awareness I mentioned before — looking at each event as if it were the only barrier that existed — so you need a bigger focus than just the specific little thing that happened. And there must be a willingness to implement changes and to follow up on those changes. Let's say you start recording and you find that every week you have six cases where the plans didn't get done in time — they get done five minutes before the patient is on the table, okay? And this happens one week; the next week it's three; the following week it's 12. All right — something has to be done. You change your process, and you want to continue monitoring the number. You want to see that this improved — that the next month it's maybe only one, and the next month none, okay? So you want to be able to implement things and measure them. Now, James Reason — he's a psychologist who works on human error, I think, or something like that — has very good publications; there is a book about organizational cultures, and he classifies them into three categories. To your left is the pathological culture: people don't want to know that there is an error. Just don't tell me, okay? Because if you tell me, I have to do something about it. The messenger is shot.
Whoever brings the bad news is the culprit; get rid of him, okay? Nobody takes responsibility for things. Oh, this happened? I don't know anything about it. I mean, I didn't do it, yeah? And so on and so forth. At the other end, to your right, is what's called the generative culture, the real safety culture. You actively seek people out to report things. The messengers are trained and rewarded, and nobody is blamed, because blaming is not a solution. The solution is changing your process. I would say in 95% of the cases where something goes wrong, nobody intended on purpose to get it wrong. Maybe one case; well, you get rid of that one. But in the majority of cases, people want to do a good job. The problem is in the process, not in the people. The people are trying to do the best they can. If something is happening, it's because the process is not perfect, okay? So, well, this is the final piece, the follow-up plan; I just talked about that. If you have some time, or even if you don't have some time, make some time, go to this website, Treat Safely. They have a lot of presentations, some of the material that I have here is from that series of presentations, and they have a lot of information about safety culture and how to do things better. And when something happens, something goes really wrong, how do we find out the causes? Well, there is what's called an engineering process, which is root cause analysis. Root cause analysis: when do we do it? When do we have to resort to this big cannon, root cause analysis? Sounds like a very important thing, doesn't it? And there are books about that. But we want to use it when there is a really serious event, of course; we want to really get to the bottom of it and find out what happened. Or when there are systematic events, I mean things that are individually minor but keep happening again and again, or things that are sporadic but still happen with some frequency.
And so the root cause analysis is to find out what happened. How do we do it? You can read about that, but basically it has three or four components. One is to collect the information, without judging at all: just collect the information, get the people involved, interview them, sit around the table. What happened? Focus on the what, not the why, not the who, but the what. This happened: the wrong tray was handed over. And then, once you have all this drawn out on a piece of paper, you start asking why. Why did it happen? And you keep going. So the therapist handed the wrong tray; that's what happened. Why did he have the wrong tray? So you find out why: well, he took it from the table. Why did he take it from the table? Well, that's where they put it; the block cutter put it on the table. Why did the block cutter put it on the table instead of in the rack with the patient's name? So that's another why, and you keep going back. Basically, the process says you have to ask at least five whys: why, why, why, why, why, until you work your way down and find the initial cause of the problem. Then you have to come up with some recommendations to remediate that situation, implement them, and monitor. That's all that root cause analysis is, and obviously we don't have time to go into much detail. So, a reporting and learning system: first of all, as we saw, it must be friendly for reporting. It has to be supportive, responsive, dynamic. And we need to have a culture which is free of fear of punishment. Unless we do that, this is not going to work; if people are afraid to report, the message is: we don't want to hear, we don't want to know. So we have many tools, checklists, and standard procedures for our handoffs. Here is a list of resources you can look through, some of the references that I mentioned. And basically the summary of all this is the art of learning from our mistakes.
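The five-whys drill just described, asking "why" repeatedly until no further "why" remains, could be sketched like this. The chain of whys reproduces the wrong-tray example from the talk; the function name and data layout are my own invention for illustration:

```python
# A minimal "five whys" walk-through, mirroring the wrong-tray example.
# Each entry pairs an answer (the "what") with the next "why" to ask;
# the entry with no further "why" is the candidate root cause.
whys = [
    ("The therapist handed the wrong tray.", "Why did he have the wrong tray?"),
    ("He took it from the table.", "Why did he take it from the table?"),
    ("That's where it had been left.", "Why was it left on the table?"),
    ("The block cutter put it there.", "Why there, and not in the rack with the patient's name?"),
    ("There was no agreed place where finished trays go.", None),
]

def root_cause(chain):
    """Walk the chain of whys; the last answer with no further 'why' is the root cause."""
    for answer, next_why in chain:
        print(answer)
        if next_why is None:
            return answer
        print("  ->", next_why)

cause = root_cause(whys)
# Remediation would then be: agree on a rule (trays go in the labeled rack),
# implement it, and monitor that the event stops recurring.
```

Note that the chain stops at a process failure (no agreed storage rule), not at a person, which is exactly the point made above: fix the process, not blame the people.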
Okay, that's basically the bottom line. Thank you very much. Thank you.