Is it 1947 yet?

Neither Lieutenant Nathan Poloski’s body nor his F/A-18 Hornet was ever found in waters almost three miles deep. All that was located in the Western Pacific after his fighter jet collided with another from the same aircraft carrier were his helmet and some pieces of debris. The pilot of the other jet ejected safely and was rescued shortly after.

The Navy accident report, all of eight pages long, was acquired by the New York Times under a Freedom of Information Act request.1 What exactly caused the accident, the report suggests, remains a mystery. It was a clear afternoon with good visibility. Both pilots were healthy, properly rested and under no unusual stress. There were no mechanical problems with either aircraft.

And so what is left, you wonder?

One of the first extensive published research studies into “human error,” in 1947, put the label in quotation marks.2 Paul Fitts and Richard Jones, building on pioneering wartime work by engineering psychologists such as Alphonse Chapanis, wanted to better understand how features of pilots’ tools and tasks influenced the kinds of errors pilots made. Using recorded interviews and written reports, they built up a corpus of accounts of ‘pilot errors.’ They found that these ‘errors’ came from somewhere: they were assessments and actions that made sense at the time. That was 1947.

Would the Navy’s current top aviator, Vice Admiral Mike Shoemaker, himself an F/A-18 pilot, have read Fitts and Jones? The two pilots involved in the midair, he opined in reflections on the report when closing the investigation on April 20th this year, should have exercised more of what his military calls “situational awareness, or S.A.” In this case, it would have meant not relying only on cockpit instruments but looking outside “to spot a looming catastrophe.”

Perhaps the Vice Admiral did read Fitts and Jones. Practically all Army Air Force pilots, Fitts and Jones had found, regardless of experience and skill, reported that they sometimes made ‘errors.’ While the eight-page report had originally only admonished (the dead) Poloski for losing S.A., the Vice Admiral broadened it to both pilots. He didn’t call what they did (or did not do) “pilot error,” but he did call it a lack of situational awareness.

Is it 1947 yet?

Fitts and Jones called their paper Analysis of factors contributing to 460 “pilot-error” experiences in operating aircraft controls. Again, “pilot error” was in quotation marks, in the very title of the paper—denoting the researchers’ suspicion of the term. And it didn’t stop there: they even used the prefix “so-called.” This is how their paper opened: “It should be possible to eliminate a large proportion of so-called ‘pilot-error’ accidents by designing equipment in accordance with human requirements.” How much clearer could they have been? And it wasn’t just equipment. Subsequent human factors research extended the study of context to operational and organizational factors.

On September 12th, 2014, Poloski had been on a practice bombing mission. With 221 hours on the Hornet, he was less experienced than the other pilot, a Navy commander, who was flying the other jet. At 7,000 feet, Poloski turned west and slowed to about 300 miles an hour. Poloski’s jet caught up with the other plane and struck its bottom left rear.

Did anyone ask sufficiently probing questions about the context in which his actions made sense? Poloski had told his mother shortly before deployment that he was looking forward to the mission. He was not about to go die in some accident. It turned out that he was not aware that the other pilot had chosen the same route. And that controllers on the carrier were occupied with landing aircraft. But “while there is no definitive evidence to suggest either pilot’s S.A. or lack thereof directly contributed to this incident, greater S.A. by all parties may have prevented the collision,” the Vice Admiral concluded.

Fitts and Jones did not call the episodes they studied “failures,” nor did they talk about them in reference to any slippage from some implicit norm or standard (like “greater S.A.” Sure. Greater than what?). Instead, Fitts and Jones used the neutral term “experiences” in the write-up of their research results.

Is it 1947 yet?

The point, for Fitts and Jones, was not the “pilot error.” That was just the symptom of trouble, not the cause of it. It was just the starting point. The remedy did not lie in telling pilots not to make errors (or telling them not to lose S.A.). Rather, Fitts and Jones argued, we should change the tools and fix the operational and organizational environment in which we make people work; that way we can eliminate the errors of the people who have to deal with those tools. Skill and experience, after all, had little influence on “error” rates: training people better or disciplining them harder would not have much impact. Change the environment, and you change the behavior that goes on inside of it. In the Navy’s report, by contrast, the focus is firmly (to speak with Don Norman) on what might have been (or not been) in the heads of the pilots, rather than in their world. The focus is on the pilots, not on the context.

Is it 1947 yet?

Is this an instance of the U.S. military unlearning or disinheriting the key things that Fitts and Jones taught them more than half a century ago? Today’s investigators, media and others frequently come up with new labels for ‘human error’ (“lack of situational awareness”). And they stop when they have satisfied their desire to blame the frontline operator for their “failures” to do this, that, or the other thing, or for lacking something which, in hindsight, is so obviously easy to point to. The “catastrophe,” after all, was “looming.” All you needed to do was look up.

Is it 1947 yet? Then let’s start putting “human error,” by whatever name, in quotation marks. Because it is merely an attribution after the fact.

Is it 1947 yet? Then understand that this attribution is the starting point, not the conclusion of an investigation.

Is it 1947 yet? Then, like Fitts and Jones, let’s not use the word “failure” or “lack,” but rather something like “experience” to describe the episode where things went wrong.

Based on the helmet investigators found—which had a big crack extending from the bottom right side up to the crown, with a hole halfway up—they concluded that Poloski must have suffered massive, fatal head trauma. We may surely hope it was swift. But then to admonish, for a “lack of S.A.,” someone whose perspective we never shared because we weren’t there, and with whom we’ll never be able to talk again?

Is it 1947 yet?

We could all learn a lot from the insights of Fitts and Jones, from their understanding, their open-mindedness and their moral maturity. Here’s two suggestions to Vice Admiral Shoemaker, and to many others who might feel tempted to blame the dead. First read Fitts and Jones, 1947. And then don’t write in a report what you wouldn’t say to the face of a mother who has just lost her 26-year old son.

  1. Schmitt E. Navy pilot’s death reflects hazards of job. International New York Times 2015 May 13;6.
  2. Fitts PM, Jones RE. Analysis of factors contributing to 460 “pilot-error” experiences in operating aircraft controls. Dayton, OH: Aero Medical Laboratory, Air Materiel Command, Wright-Patterson Air Force Base, 1947.



  1. Ron Butcher

    Excellent points and perspective from Dr. Dekker. B.F. Skinner started us on the right path when he described how people behave rationally within the context of their environment. We are now learning, through Behavioral Economics, that we often behave irrationally, but within a pattern of consistency. What Dan Ariely described as “Predictably Irrational.” I suppose it’s comforting to think that we have all the answers, particularly when our human biases of hindsight and attribution do most of the work of fulfilling our expectations. I’m reasonably sure that Admiral Shoemaker wouldn’t make such a statement to the mother of the deceased. I do, however, think that Admiral Shoemaker is concerned with keeping the remaining F-18 pilots in the air and that becomes significantly harder to do without an explanation for why one of their peers lost his life while doing everything reasonably within his control to do.

    1. Wynand

      Just one small comment – maybe the better term to use for “irrational” is “arational”, meaning neither rational nor irrational. According to Dr Robert Long (and others, I believe), most of our decisions actually fall in the “arational” category.

      1. Ron Butcher

        Would this category imply that the subconscious mind guides us without reason? I’ll admit to struggling with the concept. While I agree that many of our “decisions” are not made in a conscious consideration of all the available information, I do believe the development of our heuristics, schemas and even the occasional bias are based largely within that gradient spectrum of rational to irrational. Perhaps my own ignorance revealing itself but color me a doubter.

        1. Rob Long

          Ron, there is much to learn in this space, on which the safety industry is incredibly ill-informed and silent. The sub-conscious was Freud’s pejorative word for the non-conscious; Jung was much more comfortable with the unconscious. The social shaping of the collective unconscious is a whole new ball game for safety, which is hugely limited by being fixated in a regulatory and engineering worldview.

    2. William R. Corcoran, PhD, PE

      Dekker’s last paragraph above included:
      Here’s(sic) two suggestions to Vice Admiral Shoemaker, and to many others who might feel tempted to blame the dead. First read Fitts and Jones, 1947. And then don’t write in a report what you wouldn’t say to the face of a mother who has just lost her 26-year old son.

      How often does our cultural taboo against blaming the victim keep us from understanding the victim’s missed opportunities to have averted the harm? Do we need to look out for this?

      Similarly, does the “mother test” create a chilling effect on investigators?

  2. William S. Brown

    “In Naval Aviation, Situational Awareness (SA), or the lack thereof, can prevent or cause mishaps.” – Vice Adm. Shoemaker
    “It’s a construct! Constructs cannot cause anything!” – Dr. Charles E. Billings
    Those of us ‘in the business’ are obliged to resist the reification of our shoptalk. Dekker’s suggestion about quotation marks is a good start.

  3. Paul Nelson

    When talking about and using the term “situation awareness” (SA), remember that it is a hindsight-bias term. Secondly, it is a psychological construct, not something that intrinsically exists out there. How, then, can it be logical to admonish someone to have more SA? Furthermore, at any given moment, a person is aware to the maximum extent believed necessary and possible. If there is to be any “increase” in “SA,” it is only possible after some sort of locally perceived feedback, recognized as necessitating adjusted sensitivity to a recognized stimulus. That adjustment may incidentally decrease sensitivity to some other environmental facet perceived as unnecessary.

    We can continue to wish that humans would achieve or consistently operate within a narrower band of tolerance, or we can accept what reality presents us and design systems that rely on human adaptability and strength. Often, cost is given as the reason for not doing the design work necessary, as training the human is deemed cheaper. But cheaper at what point in the system life cycle? Short-term, maybe; long-term, do we or can we honestly ask the question that way? Integrity demands recognition that new terms for old views are like fresh paint on a mausoleum: the inside is still filled with the bones of the dead.

    1. William R. Corcoran, PhD, PE


      Thanks for bringing up Skinner.

      Would Skinner have noticed that neither pilot had an effective antecedent (cue, activator) for an evasive collision-avoiding maneuver?

      In the absence of an effective antecedent (cue, activator) is any behavior to be expected?

      Which Skinnerian reference would you suggest to the Admiral?

      1. Ron Butcher

        As I said, Skinner got us pointed in the right direction. That being said, I, like many that have followed, recognize the shortfalls in Skinner’s hypotheses, particularly given the complexity of our modern world. I think Mr. Nelson (above) provided a far more eloquent description than I could. To directly answer your questions…

        Skinner was a bright guy so he likely would have noticed but he also would have been subject to the influences of his own confirmation bias to describe the incident as a lack of negative feedback in advance of the collision. Perhaps an alarm or other automated system failed (or was never designed and installed) to alert either pilot.

        I would expect both pilots to behave as trained.

        I likely wouldn’t suggest anything to the Admiral as I believe the Admiral is likely behaving within the realm of his perceptions, influences, goals, training, etc.

    1. Ron Butcher

      I do recall the Iowa explosion and investigation. If I were to compare this one, I think the initial roll-out of the F-16 or the challenges of the swept-wing control design in the F-111 are probably more applicable. For the Admiral, he has to protect the effectiveness of a very expensive weapons system. In this case, that system includes the pilots who, by nature of the work they do, have to have an inherent confidence in the remaining components of the system. I’m confident that from the perspective of the Admiral, he wasn’t looking at the loss of the two individuals as much as he was considering the potential loss of confidence and the enhanced risks to the surviving pilots who flew that system. From a Human Factors perspective, those Organizational Factors all played a part in what is being categorized much further downstream as a loss of situational awareness. We also shouldn’t discount the Admiral’s bias as an experienced F/A-18 pilot in trying to explain the incident in the context of his experience and training.

      To Dr. Dekker’s point, we have not made it to 1947 yet. In many practices of safety, we’re still mired in 1931 and the thoughts of Mr. Heinrich.

  4. William R. Corcoran, PhD, PE

    I’ve known many naval aviators. I don’t know any who would have been fooled by Admiral Shoemaker’s human sacrifice tactic of blaming the victim.

    Do you have links to good material on the F-16 and F-111 episodes?

    I also note that Situational Awareness is, at best, a fragile barrier.

    1. Ron Butcher

      Please don’t think I agree with the Admiral. What I was trying to convey was what was likely his programmatic attempt at serving what he believed to be the greater good. It could probably be said that the Admiral too lacked sufficient “situational awareness” for his operational conditions.

  5. Marco Citterio

    I see some analogies with the accident that happened near Ascoli (Italy) on August 19, 2014. Two Italian Tornado fighters collided during a training mission; four pilots were killed. And the verdict was??
    Human Error!!!!
    (it could be the title of a book, couldn’t it?)
    Bye Sidney, see you soon!

    Some hints here:

  6. Martin Harding

    My question relates to Human Error Investigations.

    I wish to discuss an actual ‘non-complex’ incident (i.e. not an air crash!) and how The New View would investigate it.

    A worker decided to use a concrete borer on a footpath outside a shopping precinct in order to create a hole that a pole for a sign could be put into.

    The concrete borer did its work and the concrete core was extracted.

    It wasn’t till later that the worker looked at the extracted concrete core and saw that it contained a section of the electrical cable that powered a nearby set of traffic lights! The concrete corer had cut through the cable.

    The simple answer could well be: the worker should do a Dial Before You Dig check before doing any concrete coring.

    Sidney Dekker – how would you approach this “Human Error Investigation”?

        1. drbillcorcoran

          • Barrier Insufficiency

          An inescapable fact is that whenever harm occurs it is certain that there were no effective barriers to protect the item harmed from the hazards that resulted in the harm as it occurred. For advocates and aficionados of the Swiss Cheese Model, an inescapable fact is that when harm occurs every slice of cheese either had a crucial hole in it or the slice did not exist. Corrective/preventative actions, when effective, involve improvements in barriers and/or measures to reduce reliance on inherently weak barriers.

          Observation: If any slice of cheese had been American, not Swiss, the accident would not have happened.

          “It was as if the pitcher kicked the soft bunt past the shortstop to guarantee a triple.”-A hand-wringing manager

          • Insufficient Transparency

          An inescapable fact is that harmful conditions were not discovered earlier because they were not sufficiently transparent at the times they were not discovered. Harmful conditions were not discovered earlier because they were not sufficiently transparent to any of the people who missed the opportunities for discovering them.

          Observation: Transparency makes it easy to see what’s wrong.

          Observation: Transparency makes it hard to conceal what’s wrong.

          “Transparency is the best deodorant.”-Unknown (for now)

          Observation: Those who keep their cards too close to the vest forget what is in their hand.

          Observation: Transparency is the mortal enemy of deception, fraud, waste, incompetence, cronyism, wrongdoing, and sometimes even stupidity.
