Ivan Pupulidy is a human performance specialist with the US Forest Service. He is a longtime lead plane pilot with a strong background in accident investigation, both on the aviation side and the ground firefighter side. Ivan describes what he calls "the gap": the difference between how work is designed or structured from a management perspective, in documents like the Incident Response Pocket Guide (IRPG) and the Fireline Handbook, and how the work is actually performed by firefighters on the ground, in the thick of the action. There is always a gap; there will always be a gap. He identifies this gap as an obstacle in communicating with firefighters about the experiences that led to unfavorable outcomes. In this module we'll discuss some of the reasons why this creates problems for accident prevention, and the steps being taken to promote discussion between fire safety managers and operational firefighters.

What is an accident? An accident is an unforeseen, unplanned event or circumstance. If we're looking at something that was not planned and not intended, why do we do accident investigation? The reason we do it is prevention. If we're really looking at prevention, what we should be doing is looking for the lessons learned from these events so that they can be avoided.

The gap shows up in how we have historically explained adverse outcomes. Accident investigations came up with findings like "failure to follow policy and procedures." In the Dude Fire, the finding was "failure to follow the 10 Standard Firefighting Orders," and that became the groundwork for examining our accidents in terms of the 10 Standard Firefighting Orders. What we ended up with was a blame-and-train cycle: the investigation identified human error as the cause of the accident, and individuals were either counseled or disciplined. This led to less trust and less communication.
The gap between management and firefighters was widening, and the problems or conditions that supported the accident were still there. We can go back in and make a story out of what happened that makes complete sense, and sadly, in human-performance-related adverse outcomes or accidents, sometimes it doesn't make sense.

It takes courage to talk about an accident that may be perceived as a mistake. A prideful culture doesn't take lightly what may be viewed as an embarrassing situation, especially in fire culture, but steps are being taken to develop systems that reward firefighters who tell their story. Fire leadership training emphasizes sharing experiences as a way to improve decision making, improve overall performance, and break down this communication gap.

We can observe things in a mechanical sense. We can take things to the laboratory, dissect them, and break them down to their lowest components. But as we start to do this with human judgment, it becomes more and more difficult. We can't really break human judgment down into component parts, because no two people fail at exactly the same point. If I put you in a circumstance and put myself in the same circumstance, we would fail at different points. And in fact, if I were in the same circumstance over and over again, I wouldn't fail at the same point, because I would have learned from my experience. This is very important; this is what we mean by heuristics.

How much risk will you accept in the pursuit of your fireline objectives? Your experience and judgment will determine the course of action you choose. There will always be an opportunity cost: if we spend more time on the process, what does it cost us in the final outcome? This goes back to the pressures of the effectiveness-thoroughness tradeoff. Take number ten of the 10 Standard Firefighting Orders, for example: "Fight fire aggressively, having provided for safety first."
This order is a complete effectiveness-thoroughness tradeoff. If we want to be completely effective at fighting the fire, we're going to be aggressive; if we're going to be completely thorough, we're going to be completely safe. How can you be both completely safe and completely effective? What we're asking our people to do is use their judgment to come to some middle ground in applying this standard firefighting order, to develop a way of implementing this philosophy knowing they have to make some sort of effectiveness-thoroughness tradeoff. Our desire is that they make that tradeoff on the basis of safety.

What is normal or standard? If we're going to say that somebody did something wrong, we have to compare that against something which is right. So what is normal and standard? Is it well defined? How far is "below," for example? Was the rule clear, or conflicted? Is that where we really should be going? Most of these questions avoid the more important one: why did these actions and decisions make sense to the individual at the time?

How many people think that Sully Sullenberger, the pilot of the airplane that ditched in the Hudson, was a hero? Most people feel that he was a hero, but let me submit something to you. Suppose he had done exactly the same thing and there had been a rogue wave, or a helicopter taking off from the heliport right near where he landed, or a ferry crossing the Hudson River that he impacted. If that had happened, he would be considered a villain for doing exactly the same thing for which he's now considered a hero. That's categorically unfair. It's unfair to vilify people, and it's unfair to prop them up as heroes. And if you listen to the speeches Sullenberger gives now, he doesn't claim to be a hero. He describes himself as a pilot who was doing his job, doing the best that he could with a bad situation.
The truth is that people create safety in a very unsafe world. Through naturalistic decision making, through heuristics, through taking action when they perceive they're going down a dangerous path, people actually create safety. It's not that the world is safe and there's a bad person doing a bad thing, so we pluck them out and put a good person in. The truth is the world is unsafe, and the people within it create safety.

This brings us to a different perspective. The systems view takes a more holistic approach to the entire accident scenario. Instead of focusing so much on what happened, it tries to understand why it happened. Instead of looking at things in terms of error, it looks at what conditions supported error. James Reason said, "You cannot change the human condition, but you can change the conditions under which humans work." That should be our goal. Our goal in management, and in safety program management in particular, is to create a situation where our people are more likely to be successful.

Individuals involved in operations are bombarded with many dimensions of information from many directions. They have to take in that stimulus, process it, and make decisions, and those decisions result in courses of action. As each decision is made, the individual does not know the outcome. It's very much like being in a tunnel, and most of the time the tunnel ends in a successful outcome. In fact, in our firefighting operations, about 98% of the time we end up with a completely successful outcome. So the individual has an expectation, as they move through this tunnel, that the outcome is going to be successful. In the 2% of cases where we have problems, accident investigators generally come back and look at that tunnel with the great clarity of hindsight.
They can now look at each of those decision points and, using their judgment, ascribe the word "error" to the decisions, because they know the outcome. But for the individual who was making those decisions, nothing was clear. And most assuredly, if they had known the outcome was going to be adverse, they wouldn't have taken the action. For them, that clarity doesn't exist. That's why it's unfair to apply hindsight bias to these decisions and call them errors, having known the outcome. In fact, they are only errors knowing the outcome. And the more complex a situation is, the more error-likely it becomes; adding complexity increases the possibility of error. So there's a problem with this approach to the model.

The other thing that can happen as a result of this linearity is a lack of understanding of the 98% of the time that things are successful. In other words, 98% of the time the individual recognizes the potential outcome, takes corrective action, and avoids that outcome. The individual must have the freedom to make that naturalistic determination of what to do to avoid a potential conflict. They have to be able to see it to avoid it, and they have to be able to react to it in order to avoid it. And that's not done through rules and regulations; that's done through heuristics and experience. That's why we're 98% effective.