Ivan Pupilidy is a human performance specialist with the U.S. Forest Service. He is a longtime lead plane pilot with a strong background in accident investigation, on both the aviation side and the ground firefighter side. Ivan describes what he calls "The Gap." The gap is the difference between how work is designed or structured from a management perspective (the IRPG and the Fireline Handbook, for example) and how the work is actually performed by firefighters on the ground, in the thick of the action. There is always a gap; there will always be a gap. He identifies this gap as an obstacle to communicating with firefighters about the experiences that led to unfavorable outcomes. In this module, we'll discuss some of the reasons this gap creates problems for accident prevention and the steps being taken to promote discussion between fire safety managers and operational firefighters.

What is an accident? An accident is an unforeseen and unplanned event or circumstance. If we're looking at something that was not planned and not intended, why do we do accident investigation? The reason we do it is for prevention, because if we're really looking at prevention, the thing we should be doing is simply looking for lessons learned out of these events so that they can be avoided. So we can describe the gap in greater detail as a result of the adverse outcome.

Accident investigations came up with things like failure to follow policy and procedures. In the Dude Fire, we came up with failure to follow the Ten Standard Firefighting Orders, and that became the groundwork for us to look at our accidents in terms of the Ten Standard Firefighting Orders. So what we ended up with was a blame-and-train cycle. This approach identified human error as the cause of the accident, and individuals were either counseled or disciplined. That led to less trust and less communication. The gap between management and firefighters was widening, and the problems or conditions that supported the accident were still there.

We can go back in and make a story out of what happened that makes complete sense, and sadly, in human performance related adverse outcomes or accidents, sometimes it doesn't make sense.

It takes courage to talk about an accident that may be perceived as a mistake. A prideful culture doesn't take lightly what may be viewed as an embarrassing situation, especially in fire culture. But steps are being taken to develop systems that reward firefighters who tell their story.
Fire leadership training emphasizes sharing experiences as a way to improve decision-making, improve overall performance, and break down this communication gap.

We can observe things in a mechanical sense; we can take things to the laboratory, dissect them, and break them down to their lowest components. But as we start to do this with human judgment, it becomes more and more difficult. We can't really break human judgment down into component parts, because no two people fail at exactly the same point. If I put you in a circumstance and I put myself in the same circumstance, we would fail at different points, even though the circumstances were the same. In fact, if I were in the same circumstance over and over and over again, I wouldn't fail at the same point, because I would have learned from my experience, and this is very important. This is called heuristics.

How much risk will you accept in the pursuit of your fireline objectives? Your experience and judgment will determine the course of action you choose.

There will always be an opportunity cost: if we spend more time on the process, what does it cost us in the final outcome? This goes back to the pressures of the effectiveness-thoroughness trade-off. If we look at the Ten Standard Firefighting Orders, for example, and we look at number ten, "Fight fire aggressively, having provided for safety first," that is a complete effectiveness-thoroughness trade-off. If we want to be completely effective at fighting the fire, we're going to be aggressive. But if we're going to be completely thorough, we're going to be completely safe. How can you be both completely safe and completely effective? What we're asking our people to do is use their judgment to come to some middle ground in applying this Standard Firefighting Order, to understand, or to develop a way of implementing this philosophy, that they have to make some sort of effectiveness-thoroughness trade-off. Our desire is that they make that trade-off on the basis of safety.

What is normal or standard? If we're going to say that somebody did something wrong, we have to compare it against something that is right. So what is normal and standard? Is that well defined? How far is "below," for example? And was the rule clear, or conflicted? That is where we really should be going. Most of these approaches avoid the question of why these actions or decisions made sense to the individual at the time.
How many people think that Sully Sullenberger, the pilot of the airplane that crashed on the Hudson, was a hero? Most people feel that he was a hero. But let me submit something to you. If he had done exactly the same thing and there had been a rogue wave, or a helicopter taking off from the helispot right near where he landed, or a ferry crossing the Hudson River that he impacted, if that had happened, he would be considered a villain for doing exactly the same thing he did, for which he's now considered a hero. That's categorically unfair. It's unfair to vilify people, and it's unfair to prop them up as heroes. If you listen to the speeches Sullenberger is giving now, he doesn't claim to be a hero. He claims to be a pilot who was doing his job, doing the best that he could with a bad situation.

The truth is that people create safety in a very unsafe world. Through naturalistic decision-making, through heuristics, through taking action when they perceive that they're going down a dangerous path, people actually create safety. It's not that the world is safe and there's a bad person doing a bad thing, so we pluck them out and put a good person in. The truth is, the world is unsafe, and the people within our world create safety.

This brings us to a different perspective. The systems view takes a more holistic approach to the entire accident scenario; instead of focusing so much on what happened, it tries to understand why it happened. Instead of looking at things in terms of error, it looks at what conditions supported error. James Reason said, "You cannot change the human condition, but you can change the conditions under which humans work." And this should be our goal. Our goal in management, and in safety program management in particular, is to create a situation where our people are more likely to be successful.

Individuals involved in operational circumstances are being bombarded with many dimensions of information from many directions. They have to take that stimulus, process it, and make decisions, and those decisions result in courses of action. As each one of these decisions is made, the individual does not know the outcome. It's very much like being in a tunnel, and most of the time the tunnel ends in a successful outcome. In fact, in our firefighting operations, about 98% of the time we end up with a completely successful outcome.
So the individual has an expectation, as they're going through this maze, or this tunnel, that the outcome is going to be successful. In the 2% of cases where we have problems, the accident investigators generally come back in and look at that tunnel with the great clarity of hindsight. They can now look at each one of those decision points and, using their judgment, ascribe the word "error" to those decisions, because they know the outcome. But for the individual involved in making those decisions, it's anything but clear. And most assuredly, if they had known that the outcome was going to be adverse, they wouldn't have taken the action. So for them, the clarity doesn't exist. That's why it's really unfair to apply hindsight bias to these decisions and call them "errors" having known the outcome. In fact, they're only errors knowing the outcome.

And the more complex a situation is, the more error-likely that situation becomes; adding complexity increases the possibilities for error. So there's a problem with this approach to the model. The other thing that can happen as a result of this linearity is a lack of understanding of the 98% of the time that things are successful. In other words, 98% of the time the individual recognizes the potential outcome, takes corrective action, and avoids that outcome. The individual should have the freedom to make that naturalistic determination of what they should do. To avoid a potential conflict, they have to be able to see it, and they have to be able to react to it in order to avoid it. That's not done through rules and regulations. That's done through heuristics and experience, and that's why we're 98% effective.