Rush to judgment

In our rush to judgment we rarely intend to do harm. Often, we react to incomplete or even scant information, fit it into our own mental model of how things should be and then jump to conclusions that could inflict harm.

Last week, CBS Morning News showed a film clip of a man snagging a baseball from a kid who was sitting directly in front of him. The less-than-10-second clip resulted in the vilification of the man as a bully who stole the ball from the little kid. One day later the same news show apologized to the man for jumping to judgment. Why the change? Simply put, the news media learned the context around the man's actions and found that he had caught and given away several balls to those around him, including the boy in front of him.

Context is something that helps us to walk in the shoes of others – but it takes time and effort to learn context. Often it is much easier to live in a land of blame and shame. Our organizational responses to incidents and accidents have followed the same path and resulted in investigation reports that name individuals as the cause of accidents without a mechanism that helps the investigator to learn or discover context.

Our tendency is to oversimplify – enter the concept of requisite variety, which implies that the complexity of our assessment of systems has to meet the complexity of the systems that we are scrutinizing. Yet, so many of our processes are not designed to embrace complex systems.

This is what we learned in the US Forest Service as we attempted to “investigate” fatal accidents. The processes we had drove us toward judgment. People were simply admonished and told to follow the rules. When it was found that they did not follow a rule, we didn’t ask questions about the rule; we simply wrote, “The worker failed to follow rules, regulations, policy or procedures.” The conditions surrounding the action seemed irrelevant. Our Serious Accident Investigation Guide actually said,

“The causes of most accidents are the result of failures to observe established policies, procedures and controls.”

The stated purpose of investigations was prevention, yet we were not learning how to prevent accidents. We were, however, perfecting our skills of blaming others.

We had to learn how to learn from our systems following an accident and, more importantly, we had to learn how to learn when the system was delivering the unexpected. These unexpected situations exist outside our ability to fully predict and, therefore, cannot be fully regulated or controlled.

The result of our investigations was often admonishment, sometimes criminal prosecution and often a demand for simple compliance. For those of us who were working the fire-line or flying planes in firefighting operations, this approach simply felt like our leaders were telling us not to have accidents, because we knew that we could not simply follow all the rules. Asking pilots not to crash may feel good and give a sense that we have met our responsibility, but it had no effect on our accident rates. Our definition of safety had to change and along with the new definition, we had to reconsider our metrics of success. We needed to marry compliance with innovation – comply when it made sense and the rules fit the situation, while simultaneously giving room for innovation when the system delivered the unexpected.

Our journey took many years. It was led by field personnel who wanted a different approach and by leaders who recognized that the system was complex. A milestone was the development of learning focused approaches to the organizational response to incidents and accidents. One big breakthrough came when we realized that our old processes robbed us of the context that held blame in check. The result of our routine response to accidents was distrust in both the system and leadership and this led directly to silence. We finally concluded that the currency of safety is information and that we had to protect the trust of employees in order to ensure we could understand the difference between work as imagined and work as performed.

We also learned that the system is dynamic, and rules needed to be reviewed in light of new information. Adding rules made the system more cumbersome and vulnerable. In dynamic (complex) systems there is variability that defies prediction. Workers have to recognize the situation as novel, make sense of new (often conflicting) information, learn in the moment, and devise innovations to fit the new conditions.

Investigations also had to understand actions rather than judge them, and learn how to capture the conditions (factors) that influenced people to do what they did. A principle emerged for us – people act in ways that make sense to them based on their training, heuristics, biases and the conditions they perceive at the time, not because they are bad actors. From this perspective, an accident is not seen as a choice – after all, who would choose to have an accident? Rather, it is seen as a natural outgrowth of normal system and human variability.

As I mentioned in the first paragraph, “In our rush to judgment we rarely intend to do harm. Often, we react to incomplete or even scant information, fit it into our own mental model of how things should be and then jump to conclusions that could inflict harm.” Workers are just like leaders. We are all people who are influenced by what we see, hear and feel. These conditions form the context – the conditions that influence our decisions and actions. It is our duty to understand those conditions. To paraphrase Professor James Reason: You cannot change the human condition, but you can change the conditions under which people work. We learned that, in order to meet the challenge posed by Reason, we had to shape our processes to look for those conditions.

 

I have to thank Professors Reuben McDaniel, Karl Weick, Sidney Dekker, Eric Hollnagel and David Woods, along with the over 400 wildland firefighters who have died in the US since 1996, for their contributions to the concepts explored in this op-ed.

Editor’s Note: For more from Ivan, check out his TedX talk on this subject.

19 Comments

  1. David Christenson

    Excellent article describing the transformation the US Forest Service has begun from finding fault and placing blame to learning and improving. Conditions influencing decisions must become visible and the efforts to shine a light on them are worth the time they take to generate understanding. Our people deserve nothing less. Thanks Ivan.

  2. Rodney Currey

    Proverb – “To err is human.” From “An Essay on Criticism” by Alexander Pope.

    We should expect people not to follow rules, policies and procedures at times. None of us does on occasion. So why do we expect others to?

    1. Ivan Pupulidy (Post author)

      Hello Rodney, great comment! First we may have to ask who is expecting rules to be followed, and when? The driver here may be a belief that rules keep you safe. As a pilot for the Coast Guard, Air Force and US Forest Service, there were always rules to be followed and broken (if a greater emergency exists – FAR/AIM). There was an overall culture of compliance in aviation, with an understanding that rules may not apply to every situation. This understanding made many of the rules pragmatic and, especially in the Coast Guard, the rules had meaning that could be explained (Google NATOPS). With the explanation, the rules had context – this helped us to understand how to apply them and when they did not fit.

  3. Adam Johns

    An excellent article, thanks Ivan. One psychological concept that needs greater understanding around judgement is ‘Fundamental Attribution Error’. I found a great video that explains this very well (https://www.youtube.com/watch?v=Z9OF3wHDw0M).

    In our personal lives we’re all guilty of ‘dispositional’ attribution around the behaviour of others, rather than focusing on a ‘situational’ attribution approach. This links nicely to the systems thinking principle of Local Rationality, highlighted by Shorrock et al. in the EUROCONTROL ‘Ten Principles for Systems Thinking in Safety’ white paper. We need to ensure that our organisational investigation or learning processes avoid dispositional attribution and instead aim to learn what it was about the situation (context) that contributed to someone’s behaviour or actions.

    I’d love to read more on this topic, Ivan.

    1. Ivan Pupulidy (Post author)

      Hello Adam,

      I will ask Ron if he can post the paper on the subject that I delivered in Ispra, Italy, last year. If you would like a personal copy, send your email to me on linkedin messages and I will send it to you.

  4. mjteaguesfg

    Great article. Assigning blame for accidents seems satisfying, but it interferes with learning. The wildland fire service is doing better at this than the structural side of the house. We are trying to change things in structural firefighting, but it is a slow process.

    1. Ivan Pupulidy (Post author)

      Great observation! Blame largely eliminates the ability to learn from incidents and accidents. People “lawyer up” or simply do not offer information, and the agency doing the blaming is often in an intractable position and less likely to engage in meaningful inquiry. You are quite right!

  5. Rodney Currey

    Reframing “rules”, “procedures” and “policies” as, say, “requirements” or “requisites” allows our discourse within the investigation to seek further information on why requirements could not be completed.

    This in itself could be seen as a pause point, or an opportunity to change tack or direction to achieve what is required.
    Requirements appear that they could be changed if necessary, more so than rules etc.
    It denotes that “they may be willing to negotiate” given certain circumstances, whereas rules, procedures and policies don’t.

    Definitions

    Rules – one of a set of explicit or understood regulations or principles governing conduct or procedure within a particular area of activity.

    Procedures – an established or official way of doing something.

    Policies – courses or principles of action adopted or proposed by an organization or individual.

    Requirement or requisite – both refer to that which is necessary. A requirement is some quality or performance demanded of a person in accordance with certain fixed regulations: requirements for admission to college. A requisite is not imposed from outside; it is a factor which is judged necessary according to the nature of things, or to the circumstances of the case: efficiency is a requisite for success in business. Requisite may also refer to a concrete object judged necessary: the requisites for perfect grooming.

  6. Suzanne Jackson

    So about 20 per year since 1996. Was each fatal event unique? Were they all heroes? As Canada now has fires every summer as the “new normal”, I wonder if we need to change our view of forest fire at the societal level. What has the USFS learned about “fighting” forest fire?

    1. Ivan Pupulidy (Post author)

      Hi Suzanne,

      You pose interesting questions that the USFS is struggling to answer. For example, was Smokey off target when he said, “Only you can prevent forest fires”? Should his message have been more like, “Only you can prevent your house from burning up in a wildland fire”? The question of what is appropriate risk for a firefighter and what is unreasonable cannot be settled on the front line; it must be part of a larger enterprise-wide risk management approach.

      Each fatal event is similar and each is unique. Arguably many of the conditions that influenced decisions and actions remained the same (at least prior to the implementation of the Learning Review, where we focus on learning about conditions). Traditional methods of demanding more and different performance from firefighters dominated our investigations and often blamed the dead or injured for the accident.

      One thing the USFS seems to be learning, and which can be a hard pill to swallow, is how to get the largest footprint of fire on the landscape with the lowest amount of risk. Fire is part of the western forest ecosystem – most species are fire adapted, and some even require fire to germinate. The western flora has a 3–35 year fire return interval when it is not interfered with. Fire ecologists argue that successes in keeping fires small have contributed to the increased fuel load and, coupled with warmer, longer summers, have made many of the fires unstoppable.

      I hope that helps address the “new normal” and let me close by saying – perhaps there is no normal!

      Ivan

  7. Sean A. Walker

    A good article. I have seen over the years in Ireland that a section on human error has been added to the list of potential causes on accident reporting forms – a very easy box to tick. I joined an organisation in 2008 and within a year had done three versions of accident investigation training. On the fourth I said STOP!!! Are we expecting accidents? Why don’t we pump our resources into prevention? That’s what I did on my sites in Ireland. Yes, we had minor accidents, but our investigations looked at all potential root causes. We did team whiteboard sessions with the appropriate workers, which meant we didn’t overly focus on human error; we looked holistically at all possibilities and eliminated them as a team. I felt that one or two persons with no direct involvement in the processes and systems should not be investigating an accident without the workers’ involvement, and without asking the workers for their solutions for prevention. We were the only site in the group with no lost time accidents in 7 years. We all worked as a team (buddies) to prevent accidents and challenged transient contractors to work safely.

  8. Gary Wong

    James Reason’s statement “You cannot change the human condition, but you can change the conditions under which people work” recognizes the nature of complex adaptive systems. It reinforces the notion that people don’t create safety but create the conditions that enable safety (or danger) to emerge.

    In novel situations like wildfires, best practices which work in stable, repeatable times have little value. Heuristics become crucial for safety and survival. The US Marines follow three – head to high ground, keep moving, stay in communication – in the chaos of battle. What would be the heuristics that firefighters follow?

  9. Ivan Pupulidy (Post author)

    Hello Gary,

    You pose two interesting questions. What if the answer is that wildland fire is driven by production, so the heuristically driven responses are Anchor, Flank and Pinch, and keep the fire as small as possible? Safety-driven heuristics have not been popular until the last couple of years. We saw the introduction of Stop, Talk, Think, Act – even this seems to be missing something from my perspective, and that is Learn.

    I think the first paragraph may not be consistent with my view of conditions. I see conditions existing in complex systems as things that influence decisions and actions rather than just things that are created by people in the system. My experience in wildland fire operations has demonstrated that people do create safety, often through actions and decisions. The conditions are all filtered through heuristics and, in wildland fire, they are rarely discussed – thus the importance of Stop, Talk, Think, Act.

    My experience as a firefighting pilot and a military pilot has shown me that the verbalization coupled with crew sensemaking is a very strong way to create safety in real-world operations.

    I hope this was helpful – thank you for the response and I would like to hear more from you.

    Ivan

    1. garyswong

      Hi, Ivan. I agree there are more things than just people that can influence decisions and actions. I only mentioned people to match Reason’s statement. Other agents in a complex system include machines, events, even ideas. It would be fascinating to listen to stories told by firefighters and hear the decisions that were made. I wonder if there could be underlying patterns that might reveal safety heuristics.

      Regarding learning, I’ve thought about Bayes’ theorem as a heuristic. Initially try something as a best guess/hunch, learn from the new information collected, then do something based on the system’s response. Maybe Act, Learn, Adapt. Hmm, we’re probably touching into OODA loop territory.

      Thanks for the interesting posting,
      Gary

  10. John R Castleman

    Too many people do rush to judgment and point to blame. Safety and risk professionals have known for decades that this is a lazy and often false way of stopping the search for truth and reason. As people who know better we have to lead others to question beyond blame to understand the context in which decisions and actions were taken. It may be as important to understand the good things that were done and why they may have worked as much as choices which did not work out and how to know this in future.

  11. Rob Poyner

    I am a wildland firefighter. Reading through these comments I believe the wildland community safety driven heuristics are LCES (Lookouts, Communications, Escape Routes, Safety Zones). I am not certain that this is the best tool, just pointing out that I think LCES most closely correlates with the marine heuristics.
