Understanding and Adding to the Investigation Toolbox

For the last century, the evolution of accident investigation can be tied to research and to scientific advancements in how we view our work systems[1]. Three major lenses of scientific research emerge as we examine key influences on accident investigation processes: a classic faith in engineering termed Scientific Management, followed by Systems Thinking and, ultimately, an emerging understanding of Complex Adaptive Systems.


Generation of scientific theory and its influence on work systems and investigations.

In the early 1900s, Scientific Management was the dominant system used to explain and improve work. Much of this approach emerged from the industrial revolution and was codified by Frederick Taylor. Emphasis was placed on the efficiency of machines and labor, the latter focused on the analytical assessment of workflows. The overriding goal of this system was to increase production by making all components of the system more efficient. Accidents were generally viewed as failures of individuals, and people were removed and replaced as if they were failed mechanical components.

It took time for organizations to realize that firing people was not achieving the goal of increased efficiency. Employers had to retrain replacements in the skills held by the people who were fired in order to recover the performance lost when good people had accidents. Major thought leaders like James Reason began to propose a different view.

Systems Thinking started to hit mainstream accident investigation in the 1980s. Investigators began to look at work systems, searching for the active failures and absent defenses that allowed accidents to happen. Prevention strategies centered on a deep need to develop defenses in depth. Error management and error traps became popular as people were seen as triggering agents whose actions could be mitigated by finding the latent conditions lurking in the system, waiting to align when an event was triggered by active failures. Investigations were designed to identify absent defenses and active failures. Recommendations often pointed to the need to create additional defenses that would block the holes in the “Swiss cheese.” Stronger regulations and tighter procedures were the natural result, as organizations tried to plug all the holes discovered during their thorough investigations. The Systems Thinking model produced significant improvements in areas like engineering, ergonomics, manufacturing and quality control. It remains an effective tool for improving the predictable and stable aspects of our work environment, but questions lingered regarding the human contribution to accidents and incidents.

To this point, our research and experience had taught us to solve specific problems, creating both heroes and villains based almost solely on the outcome of the solution. Solutions seemed to have limited application as we recognized that not all things were predictable. Greater understanding was being demanded as researchers like Sidney Dekker began to point out that people were being named as intentional agents in their own injury, or even demise. Some researchers and practitioners recognized that the causes stated in many investigations (e.g. human error, pilot error) were, as Professor David Woods states, “simply labels that masquerade as explanations.” The agentive language used in most reports was further impeding our ability to learn from events.

As a researcher and practitioner, I began to knit several theoretical concepts together with the help of Professor Sidney Dekker. Leading this lineup were complex systems research, organizational development, cognitive psychology, social psychology, sensemaking, high reliability organizing and social construction. This happened as I was charged with conducting wildland fire accident investigations and found that the traditional tools did not fit my work environment. Wildland firefighting is complex – it is substantively reliant on human interaction and largely absent of technology. Ground firefighting operations have changed little in terms of technology over the last 40+ years, which made them an ideal proving ground for exploring accidents through the lens of Complex Adaptive Systems (CAS).

Complex systems are, by definition, not fully predictable; as a result, uncertainty is recognized as a natural part of a CAS. Investigations that embrace the complex nature of work begin to look beyond traditional cause-and-effect relationships by recognizing that every work evolution has unique attributes. This makes it difficult, if not impossible, to generalize, and often impossible for those involved to predict outcomes.

Investigators in a CAS are asked to consider why it made sense for people to do what they did. The focus of investigation shifts from judging actions or decisions as right or wrong, and people as good or bad, to developing an understanding of how worker actions can be tied to a network of influences, or performance shaping factors. When the focus shifts to learning and understanding all we can about what influenced actions, the way questions are asked and the very language used to describe incidents change. The desired outcome is that organizations purposefully increase their capacity to learn and workers recognize how important it is to learn their way through work. We add to our prevention toolbox by increasing our ability to recognize novel situations and by learning before, during and after work operations.


[1] The use of the word systems refers to the set of principles or procedures, or the prevailing cultural or social orders, that guide how we see the world and how things are done.


  1. Ronald Butcher

    Excellent article Ivan! Thank you. We should recognize that we’re somewhat “hardwired” to assign blame. Once blame has been assigned, learning stops, and we can all retreat to the false assumption that the event could never have happened to us, thanks to the variety of self-protective mechanisms we employ. To my mind, and in keeping with the important lessons Dr. Dekker shared in his “Field Guide to Human Error Investigations,” the end game of any worthwhile investigation goes directly to learning. If we allow ourselves, and that’s the real challenge, to judge during the course of the investigation, we’ll stop seeing things as they were while we search for information that supports our own confirmation bias. Thanks again for this well-written article on this critically important topic.

    1. Ivan Pupulidy

      Thank you so much for your comment. As a graduate of the Lund Master’s program under Sidney Dekker, his work was instrumental in the creation of the Learning Review (LR). The LR emerged from both research and application. The LR concept could not have gotten off the ground without the work of professors David Woods, Karl Weick, Reuben McDaniel, Edgar Schein, Erik Hollnagel, John Adams, Scott Page, Edgar Morin, Daniel Kahneman, Hubert and Stuart Dreyfus, Rene Amalberti, Paul Cilliers, and the dedication of the thousands of wildland firefighters who place themselves in harm’s way each year.

      For a full list and some additional background on the LR, please take a look at: https://pure.uvt.nl/portal/files/7737432/Pupulidy_The_transformation_01_09_2015.pdf

  2. Charles Tortise

    Thanks Ivan for a great article. An organization that is focused on being the best it can be, to the benefit of all, would treat safety as a capacity or enabler that must be addressed to the level necessary for the business aims to be achieved on an ongoing basis. It may even consider accidents as inevitable consequences of fallible humans engaging in risky activity, and so look to design a system that reduces that risk to acceptable levels, doing so by being wise about what it does – working not harder nor smarter but wiser. It may even employ a management system that considers the parts not in isolation but within the collective whole, and so look at managing the gaps. It would probably be actively engaged in learning, either after an incident or beforehand through the study of what it was already doing 9,999 times well. Alternatively, an organization may rely on safety being an outcome achieved through the imposition of a set of externally derived rules, and seek to achieve it by managing the performance of its workers, motivating them with either promises of reward or the coercive threat of punishment. Both types of organization may well have a zero accident rate up until the day they suffer an incident, but one is far more likely to achieve the aim of learning, and is probably already doing so before it is presented with the Post-Accident Report based on Root Cause Analysis of where the human error was, along with a set of “Lessons Learnt”.

    1. Ron Butcher

      I like the work and models developed by Nancy Leveson, whose STAMP model and STPA analysis technique were developed for complex systems. You may also want to explore the works of Dekker, Hollnagel and David Woods, along with their collaborative work on Resilience Engineering. I also recommend Rob Long’s work on risk. Finally, I can’t recommend enough the mountain of research on cognition, social psychology, group dynamics, etc.

      1. L Benner

        Thank you. I’m aware of the works of Leveson, Dekker and Hollnagel, but which of Woods’ papers would you recommend? As for other investigation process research material and references, are you aware of, or have you explored, the http://www.iprr.org web site?

        1. Ivan Pupulidy

          There are a number of books and papers that David has contributed to. With regard to the management of complex systems, you might try:

          “Hollnagel’s test: being ‘in control’ of highly interdependent multi-layered networked systems” by David D. Woods and Matthieu Branlat.

          I would also recommend Reuben McDaniel’s work. A great article to share with a variety of audiences would be:

          “Management Strategies for Complex Adaptive Systems: Sensemaking, Learning, and Improvisation” by Reuben R. McDaniel, Jr.

          Here is another solid article:

          “The complexity of failure: Implications of complexity theory for safety investigations” by Sidney Dekker, Paul Cilliers and Jan-Hendrik Hofmeyr.

    1. Ron Gantt

      Perhaps someone could explain to me (or point me in the right direction to do my own research on) the functional (not necessarily theoretical) differences between ‘complex adaptive systems’ and ‘wicked problems’? It seems to me that although the concepts use some different (and some similar) terminology, they both lead to similar recommendations. I am probably missing something though.

      If I tackle two problems, one using a ‘complex adaptive systems’ approach and one using a ‘wicked problems’ mindset, what would be the differences in what I do?

      Thanks in advance!

      1. Ivan Pupulidy

        Take a look at the articles by Reuben McDaniel on CAS. He poses an interesting idea: in real time, complex systems require an approach that differs from problem solving. If a system is truly complex, then actions in that system can be seen as experiments rather than problem-solving exercises. If we don’t know what the outcome of our actions and decisions will be, then we should be poised to watch carefully when we act. The conceptual approach of an experiment predisposes us to look for unexpected changes. This fits the sensemaking, learning and improvisation model.

      2. Rob Long

        Hi Ron, the two are not even close to each other. Try Conklin http://cognexus.org/wpf/wickedproblems.pdf as a soft start, and maybe http://www.apsc.gov.au/__data/assets/pdf_file/0005/6386/wickedproblems.pdf My colleague Craig Ashhurst is completing his PhD at ANU on Wicked Problems. He and I run workshops on wicked problems and their implications for risk and safety in a very practical way. It certainly isn’t about a difference in language; the whole discourse is foundationally and philosophically different. If you write to rob@humandymensions.com I can share some material from our training. We also have videos at the Centre for Leadership and Learning in Risk, but they are reserved for our online learning program. Craig is the Director of Studies for the Centre and we have been working together for 30 years.

        1. Ron Gantt

          Thank you Ivan and Rob. Having perused the resources you mentioned, I am more convinced of a huge amount of overlap between the two concepts. Here are some examples, first from the APSC paper:

          1. Wicked problems are “difficult to clearly define”, which seems to be a feature of complexity. As the paper notes, different people have different perspectives, which seems aligned with what Cilliers notes as a feature of complex systems – that no one person in the system is able to fully understand and define the system (or else they would have to be as complex as the entire system).

          2. Wicked problems have many interdependencies and are often multi-causal. This is clearly a feature of complex systems, as they are defined by the relationships between components, rather than the components themselves.

          3. Attempts to address wicked problems lead to unforeseen consequences. This is another feature of complex systems, as people acting within the system in locally rational ways lead to system level behavior that no one wants. This is similar to what Dekker calls “drift into failure”.

          4. Wicked problems are often not stable. Complex systems are dynamic, hence the “adaptive” label sometimes attached to them. This makes problem solving difficult or impossible, because by the time the problem is recognized and understood the system has changed, so your understanding is no longer valid, making traditional problem solving approaches difficult, impossible and/or counterproductive.

          5. Wicked problems have no clear solutions. This is one difference between complex systems and wicked problems, but it is merely a matter of defining the unit of analysis. Whereas “complex systems” is merely a description of a set of relationships, “wicked problems” is a description of a problem that people wish to solve. So there’s a normative aspect to wicked problems that isn’t there for complex systems, which is more descriptive.

          6. Wicked problems are socially complex. It seems to me that what the APSC paper and the Conklin paper describe as “socially complex” is just a different term for complex adaptive systems. They seem to describe the same thing. It’s interesting, though, that the wicked problems literature sees social complexity as an aspect of wicked problems, whereas it seems to me that wicked problems are a by-product of complexity (i.e., you get wicked problems because we live and operate in a complex system).

          7. In discussing what to do about wicked problems, the APSC paper advocates “collaborative strategies” and “innovative” and “flexible” approaches. This sounds remarkably similar to what McDaniel advocates for management in complex adaptive systems – sensemaking, learning, and improvisation, rather than traditional linear, authoritarian approaches. Others have advocated increasing collaboration and taking the emphasis off top-down approaches to help deal with complexity at a strategic organizational level. Schein calls for dealing with complex problems through “adaptive moves” that are not designed to solve problems, but rather to perhaps help with the problem and, importantly, provide more understanding of the system to give insight regarding future moves.

          I could go on, but I hope this clarifies the source of my confusion. From what I’ve seen regarding wicked problems and complex systems (or complex adaptive systems, or complex sociotechnical systems) I see them as inherently related and leading to very similar recommendations. Sure, there are people who use the terms incorrectly or misunderstand the concepts, but at their core they seem related. I personally believe that wicked problems are a feature of complexity, as complex systems create irreducible goal conflicts.

          However, I could be wrong. Please let me know where I’ve gone astray.

          Thanks in advance!

          1. Ivan Pupulidy

            Hi Ron,

            Great comparison between Wicked Problems and Complex Adaptive Systems approaches. Thank you for adding so much value to this conversation!

          2. Rob Long

            Yes Ron. They are related, but differ in degree with regard to the possibility of solution and outcome. Language is critical here because the notion of complexity doesn’t convey the idea of intractability and unsolvable paradox, i.e., something that can never be fixed and, indeed, can only get more wicked.

            1. Ron Gantt

              Ok, that makes sense. So the difference isn’t in the concepts themselves, but in how the concepts are framed cognitively because of the terminology that is used, i.e., the effect that what we call it has on us?

              I would imagine at least some of this is dependent on the individual and perhaps even the culture. After all, when I hear ‘complexity’ I immediately think of intractability, paradox, unpredictability, uncertainty, etc. I can see how others would conflate ‘complexity’ with ‘complicated’ though.

              There might be cultures where these terms have different meanings that help or hinder the understanding of the concept as well. One example is the term “wicked”, which in Boston here in the US is used in a very particular way as an adjective, usually describing greatness or substituting for the word “very”, as in “he’s wicked smart” (see the movie “Good Will Hunting” for examples). There, someone may see ‘wicked problems’ as less problematic given their familiarity with the term (just a hypothesis here). So there’s some variability, I imagine.

              1. Ivan Pupulidy

                Great conversation – the concept of language is certainly important. I gravitated to CAS because the language was compatible with my thinking. My perspective is not that important; however, what I realized was that the language in CAS facilitated learning from the event. CAS also helped us to move to a less biased place by accepting that things in a complex system can remain outside our control. Removing control from the conversation can be challenging for a lot of people – especially firefighters and pilots – but the realization is humbling and in that humility opportunities to learn emerge naturally.

              2. Rob Long

                That’s it Ron. We can’t separate the language, discourse and semiosis of something from its expression in the semiosphere (i.e., the world understood as signs and symbol systems). The humiliation of ‘the word’ in safety is so pronounced that language is made into nonsense, e.g. zero harm. There is also clear historical evidence of how change in language changes culture, e.g. the word ‘gay’ once meant happy. Despite this, we ought to be able to insist on the importance of definition of terms, e.g. complex systems that can ‘adapt’ (Axelrod) and wicked problems that ‘mal-adapt’. If only Safety were better educated in these things, we wouldn’t see so much technicist and mechanistic ideology flooding the safety airwaves.

  3. Rob Long

    Ivan, if we acknowledge the presence of risk and safety as a wicked problem, that changes our methodology of investigation, something I teach many people to do. The change in methodology (philosophy) then creates a more holistic method where people-as-humans in social-psychological relationship are recognized. When I look at all the models of investigation on the market, like TapRooT, ICAM and bow-tie, none of them comes close to being holistic in any sense of the word. Most just maintain the myth of objectivity and technicist assumptions of truth.

    1. Ivan Pupulidy

      Hi Rob,

      It seems we have a great deal in common. Facing the recognition that the tools we had for investigations did not meet our needs (largely for the reasons you point out), we developed our own process called the Learning Review (LR). In 2014 the US Forest Service replaced the Serious Accident Investigation process with the LR. The LR represents what you seem to be asking for – it is not based on cause (except in mechanical analyses). Rather, it is focused on understanding the context of action by mapping influences on our personnel. This is a rough description; if you are interested in more information, I would be happy to discuss it with you.


      1. Rob Long

        I like the emphasis on learning, the most important word missing in most safety discourse, and also the emphasis on mapping. I do a similar thing in my SEEK training, using social psychology and a dialogical mapping method I developed based on values-based systems dynamics.

  4. Zinta Satins

    I can’t remember the last time that I enjoyed reading through a comments thread as much as I did this one. In particular thank you Ron, Rob and Ivan for fostering the conversation.

  5. Andy Shone

    Thanks for the article Ivan! Your diagram shows Reason as a Systems Thinker; Reason’s work certainly led to a focus on management systems, but I wouldn’t consider him a “systems thinker”. Interested in your thoughts.
