The use and abuse of ‘human error’

“Oh my God. I told those guys at safety that it was dangerous and one day we would lose concentration and pay for it. I already told those guys at safety that it was very dangerous! We are human and this can happen to us. This curve is inhuman!”

These are the distressed words of the injured train driver moments after the train derailment in Santiago de Compostela, north-western Spain, on 25 July 2013. The driver can be heard pleading in sorrow, hoping for the safety of the passengers: “I have turned over. My God, my God, the poor passengers. I hope no-one is dead. I hope. I hope.” Seventy-nine people died.

In the aftermath of the accident, initial investigations ruled out mechanical or technical failure, sabotage and terrorism. That appeared to leave only two possible explanations – ‘human error’ or ‘recklessness’, or both. When society demands someone to blame, the difference – whatever it might be – can seem trivial. What followed was a display of our instinct to find a simple explanation and someone to blame. Soon, both pointed to the driver. The Galicia regional government president Alberto Nunez Feijoo stated that “The driver has acknowledged his mistake”. Meanwhile, Jorge Fernandez Diaz, Spain’s Interior Minister, said that there “were reasonable grounds to think he may have a potential liability” and confirmed that he could face multiple charges of reckless manslaughter. While safety investigations are ongoing, the driver faces preliminary charges of 79 counts of homicide by professional recklessness and numerous counts of bodily harm.

Several claims about the driver appeared in the media, often without relevant context. It was reported that the driver “admitted speeding” on the occasion of the crash [1]. The train’s black boxes showed that the train was travelling at 192 km/h moments before the crash – more than twice the 80 km/h speed limit on the curve. The implication was that the speeding was reckless. The media also pounced on an old Facebook post, reportedly made by the driver over a year earlier, about the speeds at which his trains would travel. One post, reported by Spanish media and attributed to the driver, stated: “It would be amazing to go alongside police and overtake them and trigger off the speed camera”, accompanied by a photo of a train’s speedometer at 200 km/h (124 mph). This may have been an unwise social media post, but such speeds are normal and fully permitted on the high-speed sections of the line.

However, there appears to be no evidence that the ‘speeding’ involved conscious disregard for, or indifference to, the dangers of the situation or the consequences of his actions. That would have been an extreme act. Rather, it seems that the driver was unaware of the context. This hypothesis invoked ‘human error’ explanations, though carelessness was implied. It was reported that the driver himself told the judge that he was distracted and suffered a “lapse of concentration” as he approached the curve [2]. Just minutes before the derailment, the driver received a call on his work phone. The ticket inspector told El Pais that he had called the driver to instruct him to enter an upcoming station at a platform close to the station building, to facilitate the exit of a family with children. The call lasted nearly two minutes; a long time when you are travelling at 192 km/h. Renfe employees are not allowed to use phones except in an emergency, but ticket inspectors have no access to the train cab. The driver told the court that he lost a sense of where the train was during the call, and believed he was on a different section of the track. It was also reported that the “driver got warnings before crash” [3], having received three warning signals. By the time he had engaged the train’s brakes, it was too late.
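
As a rough illustration of what such a distraction means at speed, here is a back-of-the-envelope sketch using only the figures reported above (192 km/h, a call of nearly two minutes); the numbers are illustrative, not findings from the investigation:

    # Rough, illustrative calculation using the figures reported above;
    # not data from the official investigation.
    speed_kmh = 192            # reported speed moments before the crash
    call_s = 120               # the call reportedly lasted nearly two minutes

    speed_ms = speed_kmh / 3.6              # ~53.3 metres per second
    distance_km = speed_ms * call_s / 1000  # distance covered during the call

    print(f"~{speed_ms:.0f} m/s; ~{distance_km:.1f} km covered during the call")
    # -> ~53 m/s; ~6.4 km of track pass by while the driver is on the phone

At that speed, roughly six and a half kilometres of track – including, it seems, the change from one section to another – passed by during a single phone call.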

As is common after accidents and incidents, front-line staff immediately blame themselves – which does not mean they are to blame. The Spanish press reported that immediately after the derailment, the driver allegedly said to officials at the railway station 3 km from the crash: “I ****** up, I want to die. So many people dead, so many people dead” [4].

In this case, the justice system will now need to determine whether the driver’s actions crossed the line into ‘recklessness’. Whether or how justice will be served is another issue. But one only needs to look at the context of this accident to see that ‘human error’, or synonyms such as ‘lapse of concentration’ or even ‘carelessness’, does not seem a reasonable explanation for this terrible event. And if that is all it takes for such an outcome, then it could surely happen again. The ‘human error’ explanation does not seem to serve safety, so what does it serve? Perhaps it partly serves society’s need for simple explanations and someone to blame, while absolving society itself of its own demands.

Human error or an inhuman system?

Shortly before the crash, according to reports, the train had passed from a computer-controlled section of the track to a zone that requires the driver to take control of braking and deceleration. Furthermore, there was no automatic braking system on the curve in question. The European Rail Traffic Management System (ERTMS) automatic braking programme was installed on most of the high-speed track but stopped 3 miles south of where the crash occurred. This placed responsibility on the driver to reduce speed significantly at a crucial time. The sharp bend was known to be “dangerous” and had previously been the subject of debate and warnings. According to Spanish journalist Miguel-Anxo Murado, “There were arguments for having that section of the route remade completely, but Galicia’s particular land tenure regime makes expropriations an administrative nightmare. So the bend was left as it was, and speed was limited there to 80km/h.” The driver’s recorded phone call indicated that he knew this and had foretold such an accident in a warning to the company’s safety specialists: “I already told those guys at safety that it was very dangerous. We are human and this can happen to us. This curve is inhuman.” The judge is now reportedly expanding the preliminary charges to include numerous top officials of the state railway infrastructure company, Adif, including senior rail safety officials, for alleged negligence [5].
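
To get a feel for the margin involved, here is a minimal sketch of the time and distance needed to slow from the recorded 192 km/h to the 80 km/h curve limit. The braking deceleration of 0.5 m/s² is an assumed, illustrative figure, not a value from the investigation:

    # Minimal sketch: time and distance to brake from 192 to 80 km/h.
    # The 0.5 m/s^2 deceleration is an assumed, illustrative figure,
    # not a value from the official investigation.
    v0 = 192 / 3.6   # initial speed, m/s (~53.3)
    v1 = 80 / 3.6    # curve speed limit, m/s (~22.2)
    a = 0.5          # assumed service-braking deceleration, m/s^2

    t = (v0 - v1) / a                   # time needed to decelerate
    d = (v0 ** 2 - v1 ** 2) / (2 * a)   # distance needed: v1^2 = v0^2 - 2ad

    print(f"~{t:.0f} s and ~{d / 1000:.1f} km of braking needed")
    # -> ~62 s and ~2.4 km, against ~4.8 km (3 miles) of unsupervised track

On these assumed figures, the window between the end of automatic supervision and the last possible braking point was well under a minute – little margin for a driver who had lost track of where he was.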

In a manner reminiscent of the Chernobyl inquiry, a small number of media reports broadened the focus to what might be called reckless expansion in society more generally: “I can’t help feeling that, at some profound or superficial moral level, we also played our part in the tragedy as a society; that this was the last, most tragic episode of a decade of oversized dreams, fast money and fast trains”, wrote journalist Miguel-Anxo Murado [6]. If this stretches the argument, it at least provides a counterbalance to the ‘human error’ or ‘recklessness’ explanations of this tragic event.

The psychology of error / The error of psychology

There are thousands of pages of research in the psychology and human factors literature on the issues mentioned so far. The ‘reversion to manual’ problem has been studied extensively in the context of automation and manual operation. The distracting effects of phone calls – hands-free or not – are well documented. ‘Multitasking’ is known to have devastating effects on performance, yet conflicts between safety and efficiency goals often demand that we switch from one task to another in a given timeframe. There are thousands of articles on situation awareness, along with many books. The same is true of safety culture, including how organisations respond to safety concerns.

But ‘human error’ has been a fascination of psychologists for over a hundred years. Psychology is a scientific discipline concerned with the mind and behaviour, and therefore tends to have an individual or social focus. For decades, human mishaps have been dissected and further dissected into multiple categories in the scientific literature. Well-known names including James Reason and Don Norman were early pioneers of the study of error, and developed psychological explanations for slips and lapses via diary and laboratory studies (Reason, 1979 [7]; Norman, 1981 [8]). Mistakes were subsequently studied, and ‘violations’ followed (see Reason’s landmark ‘Human Error’, 1990 [9]). Human factors (or ergonomics), meanwhile, is a design discipline concerned with interactions in socio-technical systems. Knowledge concerning people and complex safety-critical systems has been applied to real systems in most industries to avoid, reduce or mitigate ‘human error’.

Indeed, the popularisation of the term ‘human error’ has provided perhaps the biggest spur to the development of human factors in safety-related industries – but with a downside. When something goes wrong, complexity is reduced to this simple, pernicious term. ‘Human error’ has become a shapeshifting persona that can morph into an explanation of almost any unwanted event. It is now almost guaranteed to be found in news stories pertaining to major accidents. Interestingly, some reports specify that human error was not the cause, the reverse implication being that human error would otherwise have been the cause (e.g. “Paris train crash: human error not to blame”, Telegraph, 13/07/13). Since the term suffices as an explanation, little or no mention of findings in psychology or human factors, including the context and conditions of performance, is required.

This is very unsatisfactory to many psychologists: the implication in research and practice is that human error is ‘normal’ – it is part of who we are. Similarly, it is very unsatisfactory to many human factors specialists, who try to predict and design for error. But in the contexts of safety and justice, ‘human error’ has been taken to mean something different – a deviation from the normal: from rules, procedures, regulations and laws.

The demise of error

Despite decades of research, there has been little agreement on the precise meaning of the term – and, more recently, on whether it has any real meaning at all. While the term may have some value in simple systems and situations, there are problems with its use in complex systems such as air traffic control (ATC). These are now well documented in the literature. While ‘human error’ is still the explanation of choice for accidents, the term itself fell into disrepute among some thinkers more than a decade ago [10, 11].

Having been fascinated by the concept of human error since encountering it while studying psychology in the early 1990s, I gradually and reluctantly accepted these arguments in the first few years of the 2000s. Reading the works of Erik Hollnagel, Sidney Dekker, David Woods, Rene Amalberti and others, I grew increasingly uncomfortable with the concept and the term. This inconveniently coincided with the final stages of a PhD on human error in air traffic control. My own realisation finally crystallised when reviewing Erik Hollnagel’s book ‘Barriers and Accident Prevention’ in 2004 [12]. I committed to abandoning the term. My reasons, presented to the Safety and Reliability Society in 2006, followed the arguments of those mentioned above.

  • ‘Human error’ is often a post hoc social judgement. ‘Human error’ is one of the few things that often cannot be defined unambiguously in advance of it happening.
  • ‘Human error’ requires a standard. To know that something is an error, it must be possible to describe a non-error. This can be surprisingly difficult, partly because there are so many “it depends”. In the context of complex interacting systems such as ATC, there are many ways to get an acceptable result.
  • ‘Human error’ points to individuals in a complex system. In complex systems, system behaviour is driven fundamentally by the goals of the system and the system structure. People provide the flexibility to make it work.
  • ‘Human error’ stigmatises actions that could have been heroic in slightly different circumstances. What are described as heroic actions could often have been described as tragic errors if the circumstances were only slightly different. The consequences of heroic actions are not known in advance.
  • Underlying processes of ‘human error’ are often vital for task performance. In the context of error, we often refer to psychological activity involved in perception, memory, decision making or action. Taking one example, without expectation, radio-telephony would be very inefficient. Occasionally, one may hear what one expects instead of what is said, but this must be set against improved efficiency during thousands of other occasions.
  • ‘Human error’ is an inevitable by-product of the pursuit of successful performance in a variable world. The context and conditions of performance are often vague, shifting and suboptimal. The ability to adapt and compensate comes at a cost.

Still, the term ‘human error’ is used frequently in human factors and psychology, and practitioners reside in several camps. The first camp continues to use the term with ‘good intent’, adding caveats that human error is normal and that we need to talk about it in order to learn from it. But in doing so, they risk sounding like Humpty Dumpty in Lewis Carroll’s ‘Through the Looking-Glass’ (“‘When I use a word,’ Humpty Dumpty said, in rather a scornful tone, ‘it means just what I choose it to mean — neither more nor less.’”). A related second camp uses the term reluctantly, for convenience, while rejecting the simplistic concept and arguing that the term refers to a symptom of deeper organisational troubles. A third camp has abandoned the use of the term, except reflexively, to refer to the term itself. This latter camp perhaps recognises that the term itself is damaging. Personally, I have moved from camp one, through camp two, and finally to camp three. While psychology and human factors did not intend some of the simplistic meanings ascribed to the term, the genie is out of the bottle.

Words shape worlds

Does any of this matter, if we still use the term ‘human error’ when we know what we mean? Do we risk stepping onto a euphemism treadmill, skipping from one term to the next? [13] The argument presented here is that it does matter. Our language affects the way we view the world and how we approach problems. Even if we know what we mean when we talk about ‘human error’, and even if the term does seem to fit our everyday slip-ups and blunders, it reinforces unwanted connotations, especially when we are talking about high-hazard systems. While we cannot put the genie of ‘human error’ back in the bottle, we can use a new vocabulary to create a new understanding.

Left with a ‘human error’-shaped hole in my vocabulary several years ago, I found an alternative concept thanks to Erik Hollnagel: performance variability. This is not simply a replacement term or a euphemism, but a new way of thinking that acknowledges how systems really work. Performance variability, at both the individual level and the system or organisational level, is both normal and necessary, and it is mostly deliberate. What controllers actually do varies, because it has to. We have to make efficiency-thoroughness trade-offs, as well as other trade-offs. This flexibility is why humans are required to do the job. People also naturally have different preferred styles of working, and there are several ways to do the same job. There is, of course, some leftover unwanted variability – you can’t have one without the other. But without performance variability, success would not be possible. It is not the aim of this article to explain this in more detail, but the reader is encouraged to explore it further (see Hollnagel, 2009).

More generally, if we wish to understand how systems really work, and improve how they work, we need to enrich our vocabulary with systems concepts – and use them in preference to simplistic terms that do not help explain how systems actually function. This is not to say that people are not responsible for their actions – of course they are. What is relevant is the difference between normal variability in human performance and what we define as recklessness. Labelling either as ‘human error’ is not helpful.

Folks, it’s time to evolve ideas

‘Human error’ has long outlived its usefulness in human factors, safety and justice. We can’t expect society to change the way it thinks and talks about systems and safety if we continue in the same old way. It’s time to evolve ideas and think in systems, but for that to happen, our language must change. Overcoming ‘human error’ in our language is the first hurdle.

Further reading

Dekker, S.W.A., (2006). The field guide to understanding human error. Ashgate.

Hollnagel, E. (2009). The ETTO principle: Efficiency-thoroughness trade-off. Ashgate.

Meadows, D. (2009). Thinking in systems. Routledge.


[1] Spain train crash driver admits speeding in emergency call recording, Telegraph, 06/09/13
[2] Spain train crash: Driver told judge he was ‘distracted’, Telegraph, 06/09/13
[3] Spanish train wreck driver got warnings before crash, Reuters, 02/08/13
[4] ‘Reckless’ Train Crash Driver Held By Police, Sky News, 26/07/13
[5] Train crash judge summons track safety managers, Leader, 10/09/13
[6] Spain train crash: human error over decades, not just seconds, Guardian, 25/07/13
[7] Reason, J. (1979). Actions not as planned: The price of automatization. In G. Underwood & R. Stevens (Eds.), Aspects of consciousness: Vol. 1. Psychological issues. London: Wiley.
[8] Norman, D.A. (1981). Categorization of action slips. Psychological Review, 88, 1–15.
[9] Reason, J. (1990). Human error. Cambridge University Press.
[10] Hollnagel, E. and Amalberti, R. (2001). The Emperor’s New Clothes, or whatever happened to “human error”? Invited keynote presentation at 4th International Workshop on Human Error, Safety and System Development. Linköping, June 11–12, 2001.
[11] Dekker, S.W.A., (2006). The field guide to understanding human error. Ashgate.
[12] Hollnagel, E. (2004). Barriers and accident prevention. Ashgate.
[13] This risk, and the comparison with terms for disability, was pointed out to me by a human factors colleague, which prompted this article.

Post adapted from: ‘Human error’: The handicap of human factors, safety and justice http://humanisticsystems.com/2013/09/21/human-error-the-handicap-of-human-factors-safety-and-justice/

9 Comments

  1. Sidney Dekker

    Hi Steve, beautifully captured — your autobiographical angle (from essentialist researcher to constructionist scientist concerned with the ethics and justice of what we do) gives it all the more “oomph,” genuineness and credibility. I like it a lot. The next edition of “The Field Guide to Understanding Human Error” will likely not have “Human Error” in its title!
    Sidney

  2. Chris Kelly

    Interesting article Steve. However, I think ‘human error’ still has some validity and usefulness and I wouldn’t abandon the concept entirely. 🙂
    Also, in fairness to Reason, he has changed his views over the years and has also moved towards the performance variability camp (e.g. ‘Revisiting the SCM’, written with EH).
    Chris

  3. Sue Milner

    Hi Steve, Great piece of prose at just the right level. I have been feverishly directing many of my clients towards it, so thank you.
    ‘Human error’ as a term and (unspecified) concept is really deeply entrenched, and I find it very hard to persuade against its use – but that hasn’t stopped me trying. If, as Chris says, Reason has moved towards Erik’s performance variability terminology, it would be fantastic if he would follow Sidney’s lead and publish updates to his books, which are still considered the bible in my (rail) field. Sadly I don’t think that will happen.
    What I am trying to say is keep producing posts like this to provide me with more ammunition to support my arguments.
    Again thank you

    Sue

  4. Al Ross

    Steve

    It does appear this is resonating with people – keep banging the drum! I remember Flach’s call for the vocabulary of co-ordination and control but in practice error has probably been too seductive as a way to engage people (say, doctors) with HF…

    It is sad to say that your piece reminds me directly of something I wrote about a rail crash a long time ago:

    On the day of the Ladbroke Grove rail crash in October 1999, it was immediately reported that driver Hodder (who tragically died in the crash) had passed signal SN 109 at danger. Within a matter of days it was widely noted in the press that he had been convicted of assault in 1998 (for a minor offence in respect of which he was given a conditional discharge). The clear motive for such reporting was to encourage the notion that the tragic events might be attributed to driver Hodder’s personal characteristics. What price a front page article on the personal social histories of the Thames Trains managers, those responsible for reviewing signal SN 109 (which had been passed at danger on 8 previous occasions); the signallers; the consultants involved in reviewing implementation of Automatic Train Protection in the Thames Trains fleet? It may be argued that, in such cases, no amount of contributory system factors, discussed in subsequent inquiries, eroded a primary impression that blame lay with the driver. How many people were still interested by the time Counsel to the Ladbroke Grove inquiry, Robert Owen QC, concluded that driver Hodder’s criminal record ‘does not appear to have any bearing on the causes of the collision’?

    Davies, J.B., Ross A.J., Wallace, B. and Wright, L. (2003) Safety management: A qualitative systems approach London: Taylor and Francis; ISBN 0-415-30371-0.

    I blame psychology:

    […] old-fashioned cognitivist psychology has not provided the insights it promised. It has pointed attention in the wrong direction: toward the alleged cognitive mechanisms which generate error, instead of the social world in which we all actually live and work.

    Wallace, B. and Ross A.J. (2006) Beyond Human Error: Taxonomies and Safety Science Boca Raton, Florida: CRC Press/Taylor and Francis; ISBN 0849327180.

    Al

  5. John Wilkinson

    Great article Steve – I am only just coming to terms with this argument but your explanation and background have helped me get there. However, we may need to rethink this for press/media/public consumption, since no-one on e.g. the Today programme (the UK’s main morning radio news programme) is likely to throw this term around, and anyone using it is likely to be immediately asked for a simple clarification. One term that has started to creep in is ‘honest error’, but of course this implies there are dishonest ones, and that isn’t what we mean by ‘violations’. I suppose, reading the article, one way of capturing the variability side is to talk about flexibility – a major plus in making complex systems work and a major minus when the system complexity makes it misfire and produce errors. Using ‘efficiency-thoroughness trade-offs’ is good from the science viewpoint (we can pin down what it means, so we can test it) but it is not exactly a simple term to bandy about. So maybe people are the lubricating oil in the system, but if the system gets too hot around them, the oil clags up and creates unpredictable failures across the system… Hope this is food for thought (and that it hasn’t already been thought, as it were!). If we want better and fairer explanations in the public arena we need simple agreed language and arguments. John

  6. Lucia

    Hi Steve,
    Very engaging article. It has given me some food for thought… especially about how this human error label has perhaps inadvertently influenced our subsequent management of incidents and/or injuries. Thanks for sharing.

  7. Steven Shorrock (post author)

    Many thanks to all for the comments.

    One thing I am mindful of is where the term ends up. In incident reports, in some countries, the term human error can be taken to imply culpability in the justice system. I recently saw a presentation by an air traffic controller in one European country who spoke about exactly this, and was considering using language and concepts from the justice system in incident reports in order to prevent it!

    John – I have always found that operational staff immediately understand the concept of performance variability, and ETTO – it is actually intuitively obvious. I have the same misgivings over ‘honest mistake’ – though it is used in definitions of just culture. Partly what I like about performance variability is exactly what you suggest – in and of itself, it says nothing except that performance is variable. There is no value judgement attached, and if people say “What do you mean?” then that is perfect. With “human error”, people think they know what is meant, but think of different things – slip, lapse, mistake, blunder, carelessness, inattentiveness, negligence…

    Al – you were, and are, spot on! Your and Brendan’s work changed the way I looked at taxonomy (I seem to remember reviewing it for a journal!), and made its way into a former PhD student’s research:
    Evaluation of the HFACS-ADF safety classification system: Inter-coder consensus and intra-coder consistency
    http://www.sciencedirect.com/science/article/pii/S0001457509002334
    Coding ATC incident data using HFACS: Inter-coder consensus
    http://www.sciencedirect.com/science/article/pii/S0925753511001184
    Reliability studies of incident coding systems in high hazard industries: A narrative review of study methodology
    http://www.sciencedirect.com/science/article/pii/S0003687012001007
