Recently I was at a chemical plant assessing the management system and “culture” of the site. The corporate safety group was alarmed by a spike in injuries at the plant over the last year and wanted an outside opinion on what was going wrong to cause all these injuries. This mindset is common. After all, failure must be caused by bad inputs – i.e., all these injuries mean that something must be wrong. In this case the assumption was that the “culture” (their word) of the site must be broken, causing the employees to want to take the risks that led to the injuries. So our job was to go out there, figure out what was wrong, and make recommendations to fix it.
As I continue my exploration deeper into the “safety differently” line of thinking, I find myself becoming more and more aware of how often we assume we’re performing a logical analysis when we’re really engaging in satisficing. Take, for example, a discussion I had recently with a colleague about a client who had a problem employee who just wouldn’t wear his safety glasses while operating a piece of equipment. Other employees didn’t seem to have that problem; therefore, the problem must lie with the employee. However, this line of thinking ignores other possible explanations. For example, it assumes that we have perfect knowledge of the behavior of all employees. But what if what we’re really seeing isn’t one employee’s unwillingness to wear his safety glasses, but rather the other employees’ ability to hide their behavior from us? Or what if we’re seeing the grit of the other employees, who force themselves to wear the safety glasses in spite of the problems it creates for them, whereas the problem employee just can’t take it?
There are other explanations possible, but it amazes me sometimes how quick we are to judge others’ behavior and use it to explain who they are without considering what the world looks like from their perspective. I mean, all we saw was that one employee wouldn’t wear his glasses and we’re ready to use it as evidence that this employee is lazy, stubborn, arrogant, and should be fired. All this without a thorough analysis. All this without knowing what the world looks like from the perspective of that employee.
Back to the chemical plant – so we’re walking through the facility, escorted by the safety professionals at the site, and we happened upon the plant manager. We engaged him in discussion, asking what he’d done differently to respond to the spike in accidents. What he told us was a bit shocking. He required each member of his management team to spend two entire shifts (two different shifts for each manager) with an operator in the plant, going where they go, doing what they do. They called it “a day in the life of an operator.” Out of that the managers came up with a long list of action items aimed at fixing the problems the operators faced on a daily basis. Furthermore, as subsequent discussions with the managers highlighted, some of the best gains from the process came from managers and line employees getting to know each other on a more personal level. The relationships facilitated problem solving, as employees now felt more comfortable going to managers with issues and the managers felt more responsible for fixing them.
And you know what was missing from the discussion entirely? Blame. Throughout our stay there was very little of the normal blame game that you see in these sorts of assessments. Almost unanimously, people reported a sense of teamwork, a sense that employees at all levels care for each other. When problems were discussed, the focus naturally fell on external, environmental, or system causal factors rather than personal issues. Not only is this often a more ethical perspective; it is also more pragmatic. Focusing on problems at the individual level often leads to safety whack-a-mole. However, as readers of this site are aware, the problem is almost always a system problem. By taking the focus off of the individual, the site was naturally forced to look for the contextual features that led to the issues they wanted to fix. This leads to better, more sustainable solutions to problems.
Now, doing something similar to “A Day In The Life Of” won’t occur to most managers. We’re used to fitting into certain scripts of organizational life, such as the one where workers do the work and managers do the managing, and each group complains about the other. Changing that script by having one group do the work of another is sometimes seen as a violation (sometimes literally, in some environments). Unfortunately, these scripts are self-sustaining. The script creates a divide, which leads to ignorance of the context of each party’s actions, which leads to more simplistic explanations of behavior based on personal faults (see the Fundamental Attribution Error), which leads to a greater divide. When we see the employee not wearing his safety glasses, we reflexively attribute it to a flaw inside the employee (personality, character, etc.). But the employees know there’s more to the story that we’re not attempting to see. So they feel like it’s a witch hunt, and mistrust ensues.
The only way to stop a self-reinforcing cycle is to choose to step out of it. As Pete Blaber asked – how would we organize if we did not know how we were supposed to organize? To put it into our context – how would we practice safety if we did not know how we were supposed to practice safety? Elsewhere in his book, The Mission, The Men, and Me, he calls this line of thinking “the art of the possible.” By engaging in the art of the possible and finding ways to get out and see things from a different perspective – the perspective of those dealing with the messiness of normal, everyday work – we won’t eliminate satisficing and the Fundamental Attribution Error. But we may blunt their effects, as we will be more likely to find alternate explanations for behavior that facilitate relationship building and problem solving. After all, if that one employee who isn’t wearing his safety glasses is someone you’ve built a relationship with, you’re more inclined to go ask why and to seek external (i.e., contextual) explanations for his behavior. You engage in conversation (leading to more trust), which in turn leads to a richer understanding of potential solutions.
The Fundamental Attribution Error might be involved.
The Wikipedia entry is pretty good.
All the best,
The posting made me think back to the 1980s, when I looked after line crews in an electric power organization. At that time, Management by Walking Around (MBWA) was the latest and greatest, promoted by the book “In Search of Excellence.” So, like good sheep, off we went. Besides wandering around, I did as the chemical plant managers did and spent a whole day with a crew, mainly listening and asking questions. Sure enough, once a rapport was established, the problems and concerns the guys were facing on a daily basis quickly poured out. The guys certainly pointed the way for me and helped set priorities for fixes. But as I look back, I now know that I would do it differently. Safety differently.
I would concentrate more on what crews and individuals did to make things go right, i.e., Safety-II. I’d be noting down workarounds that closed the gap between “work-as-imagined” and “work-as-actual.” I would use the earned trust to ask what they did when faced with a safety dilemma (e.g., when safety rule “A” conflicted with rule “B”, what happened?). If not wearing safety glasses was a perceived issue, I’d ask people to tell me a story about when safety glasses made a situation safer or more dangerous. The collected stories would give me a better sense of the safety culture, i.e., how work really gets done when the boss isn’t around.
I would use the earned trust to ask what they did when faced with a safety dilemma (e.g., when safety rule “A” conflicted with rule “B”, what happened?)
This is one of the situations that force workers into Knowledge-based Behavior.
Knowledge-based Behavior has a reputation for being in error about half the time!
Skill-based, Rule-based, Knowledge-based.
A recognized and generally accepted good operational practice (RAGAGOP) is:
Do not force workers into knowledge-based behavior!
What knowledge-based behavior fiasco is the poster child for this one?
Macondo pressure test results interpretation?
Three Mile Island interpretation of increasing pressurizer level?
Hi, William. The prime reason for asking the prompt question is to learn where rule conflicts exist. Once identified, one could then analyze and determine whether the solution taken was correct. The second reason is monitoring novel situations – events that have never occurred before. Perhaps something unknowable or unimaginable has emerged and placed the crew in a predicament of conflicting rules.
I’m curious about your comment re: do not force workers into knowledge-based behavior. What are your views on mentoring and Master/Apprentice programs? I do have concerns about “old salts” passing down traditional practices or workarounds that are fallacies and myths.