Look at the man in the picture to the right. He is fastening a bolt using a spanner. To get more torque he has done what is common practice in industry: he has slipped a piece of pipe over the handle to extend it, often called a cheater bar. Put differently, he has adapted the standard tool that was provided to him.
When everything is going well we’d like to believe that it is because organisations are in control – that projects are successful because people follow rules and standards, and that mechanical and technical performance is carefully held well within prescribed boundaries. In this worldview there is little, if any, uncertainty or ambiguity in operations. Work is a straightforward execution of plans and designs developed within approved know-how. There should be no surprises, no accidents, zero harm.
But work is not simply about rule following. Nor is it just about executing plans and designs. Work is about getting things done by bridging the gap between design and reality, between complicated and complex, between the static and the shifting, between the ordered and the messy. To paraphrase Brian Wynne: The generic concepts always need to be applied in a local context that is unique, diverse, changing, open, and connected to a myriad of goals and needs.
As plans make their way into applied settings, gaps and discontinuities always appear. There may not be enough tools around, instructions may not be clear enough, or materials may show up late or ahead of schedule. The ideal world is rarely present. And there is rarely enough time to wait for the ideal to come around.
To make ends meet, people use what’s available. If the solutions and tools provided to them are not enough, or too cumbersome to use, they will come up with something that makes success more reliable. They will write things on displays. They may hide tools to ensure that the job can get done the next day, or they may put a pipe on a spanner.
Work practices brew and develop in overcoming these gaps. But it’s a gap in which things and solutions are often uncertain, negotiable, ambiguous or merely good enough. Is the spanner/pipe solution compliant? Is it approved? Is it safe? Is the spanner designed to cope with this? Would it be noticed in a job safety observation? Is it a safety issue at all? Whose issue is it? Is it an acceptable practice? Would opinions change if the solution were somehow involved in an incident or injury?
Organisations may be fine with these things and prefer not to intervene, since the solution, uncertainty and ambiguity allow the job to get done. Or perhaps worse, because they assume that work is successful because workers simply follow procedures.
The cheater bar may be a brilliant solution, but the problem is that adaptations, innovations, practices, strategies and solutions that grow in the gap between plans and reality often develop in an uncontrolled and uninformed way. The rest of the organisation has not been given the chance to assess and evaluate how the solution fits in within a larger context. The innovation has not been given the opportunity to be exported, refined, or risk assessed. It remains an uninformed and uncontrolled resource.
At one stage, while we were watching the man fasten the bolt, he put his back under the extended handle and pushed with it as he rose to an upright position. It looked rather uncomfortable. Nothing went wrong, though. Looking for failures would not be a good strategy to capture this aspect of work.
Continued success requires an ongoing engagement with those who are left to finish the designs and plans. Safety professionals, with their interest in people, have the possibility to do just that – to be critical thinkers and a concerned voice where things are difficult, ambiguous and frustrating. To ask questions about what works well, about improvisations and solutions, and about what pisses people off. However, this requires that we start looking for things other than errors, deviations and failures. It requires that we start talking about what we do to achieve success.
Reference: Wynne, B. (1988). Unruly technology: Practical rules, impractical discourses and public understanding. Social Studies of Science, 18, 147–167.