Let’s talk about success

Look at the man in the picture to the right. He is fastening a bolt using a spanner. To get more torque, he has done what is common practice in industry: he has taken a piece of pipe to extend the handle – often called a cheater bar. Put differently, he has adapted the standard tool that was provided to him.

When everything is going well, we’d like to believe that it is because organisations are in control – that projects are successful because people follow rules and standards, and that mechanical and technical performance is carefully held well within prescribed boundaries. In this worldview, there is little, if any, uncertainty or ambiguity in operations. Work is a straightforward execution of plans and designs invented from within approved know-how. There should be no surprises and no accidents; there should be zero harm.

But work is not simply about rule following. Nor is it just about executing plans and designs. Work is about getting things done by bridging the gap between design and reality, between complicated and complex, between the static and the shifting, between the ordered and the messy. To paraphrase Brian Wynne: The generic concepts always need to be applied in a local context that is unique, diverse, changing, open, and connected to a myriad of goals and needs.

As plans make their way into applied settings, gaps and discontinuities always appear. There may not be enough tools around, instructions may not be clear enough, or materials may show up late or ahead of schedule. The ideal world is rarely present. And there is rarely enough time to wait for the ideal to come around.

To make ends meet, people use what’s available. If the solutions and tools provided to them are insufficient, or too cumbersome to use, they will come up with something that makes success more reliable. They will write notes on displays. They may hide tools to ensure that the job can get done the next day, or they may put a pipe on a spanner.

Work practices brew and develop in overcoming these gaps. But it is a space in which solutions are often uncertain, negotiable, ambiguous, or merely good enough. Is the spanner/pipe solution compliant? Is it approved? Is it safe? Is the spanner designed to cope with this? Would it be noticed in a job safety observation? Is it a safety issue at all? Whose issue is it? Is it an acceptable practice? And would opinions change if the solution were somehow involved in an incident or injury?

Organisations may be fine with these things and prefer not to intervene, since the solution, uncertainty and ambiguity allow the job to get done. Or, perhaps worse, they assume that work is successful because workers simply follow procedures.

The cheater bar may be a brilliant solution, but the problem is that adaptations, innovations, practices, strategies and solutions that grow in the gap between plans and reality often develop in an uncontrolled and uninformed way. The rest of the organisation has not been given the chance to assess and evaluate how the solution fits within a larger context. The innovation has not been given the opportunity to be exported, refined, or risk assessed. It remains an uninformed and uncontrolled resource.

At one stage, while I was watching the man fasten the bolt, he put his back under the extended handle and pushed upwards as he rose to an upright position. It looked rather uncomfortable. Nothing went wrong, though. Looking for failures would not be a good strategy to capture this aspect of work.

Continued success requires an ongoing engagement with those who are left to finish the designs and plans. Safety professionals, with their interest in people, are well placed to do just that – to be critical thinkers and a concerned voice where things are difficult, ambiguous and frustrating; to ask questions about what works well, about improvisations and solutions, and about what pisses people off. However, this requires that we start looking for things other than errors, deviations and failures. It requires that we start talking about what we do to achieve success.

Reference: Wynne, B. (1988). Unruly technology: Practical rules, impractical discourses and public understanding. Social Studies of Science, 18, 147–167.

Comments

  1. John Culvenor

    Hi Daniel, good story. Relevant ideas. Just something niggling at me: unless it is a left-hand thread, if he pushed up with his back, wouldn’t that undo rather than fasten the nut?

  2. Steven Shorrock

    Spot on, Daniel. The same assumption is made with respect to safety management: that things are so safe *because of* safety management practices. Especially where people are involved, it doesn’t really hold. When you actually observe people at work, it is clear that continual adjustment to ever-changing demand and conditions is what keeps things safe at an operational level. Of course, safety management systems also have an important role to play, along with engineering and training practices, but the causal link with safety is not simple or clear. The variability that we might seek to constrain today may be the variability that keeps things safe tomorrow, as the need and tolerance for variability also vary.

  3. Ben Kirkbride

    Daniel,

    Another great post! The school of thought in your post makes very good sense, especially with regard to variances in risk perception. On one hand, ‘innovation’ in the eye of the beholder; on the other, ‘unsafe practice’ in the eye of the organisation. Just yesterday, during a site induction, I asked the participants whether they adopted safe work practices outside of work.
    From this point things became interesting, as one individual explained his not-so-pleasant experience and, therefore, his desire to make ‘safety always’. On the flip side, another participant revealed that he ‘only adopted safe work practices whilst at work’. This was mainly because he had not experienced or heard of any unwelcome events. Just imagine the ‘adaptations, innovations, practices, strategies and solutions that grow’ outside the work environment. Yet, because of the complex, robust and technical safety management systems we have in the workplace, these are never introduced, for fear of breaking a rule or procedure.

    My point here is that risk perception can stifle creativity and workers’ ability to adapt work practices. By the same token, organisations need to be able to adjust accordingly and provide a little more resilience when dealing with innovative solutions, especially in the complex socio-technical world we live in.

    The other interesting finding for me was that individuals who have experienced ‘safety gone wrong’ are intrinsically safer than those who have not. How many people do you know who work in safety roles because they were injured at work, or know someone who was?

    BK

  4. Les Henley

    Hi Daniel, I agree with your post. (I was a fitter by trade in a former life, many years ago.)
    In this particular instance there are several significant risks that an uninformed or unskilled labourer may be exposed to – and that other parties may not be aware of if they observe the action.
    The spanner, and the bolt/nut, are designed and manufactured to a certain specification. That means the spanner’s length is designed to limit the force (torque) that can be applied to the bolt/nut. It looks like a ring spanner to me. That means the ‘ring’ is designed to resist distortion up to a maximum load when the handle (lever) is applied. Exceed the design torque and the ring fails – the spanner ‘lets go’. (I’ve witnessed it happen.)
    And the bolt/nut are designed to take a maximum torque when tensioning. Exceed that torque and the tensile strength of the bolt shaft is affected. (I’ve seen these break too.)
    In one paragraph you mention that “Nothing went wrong though.” But it may only be a matter of how many times this is done before something goes wrong. (‘I did it this way last time and nothing went wrong’ can be a deadly risk assessment.)
    Each time a ‘cheater bar’ is used, especially with the added force of the leg muscles (lifting with the back), there is potential for the spanner to be fatigued, and this may eventually lead to a failure under load. Or the bolt/nut assembly may fail under the force being applied.
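
    To put rough, illustrative numbers on this (the figures below are assumptions for the sake of the example, not measurements from the photo): torque scales linearly with the length of the lever,

    $$\tau = F \cdot L$$
    $$\tau_{\text{spanner}} \approx 200\ \mathrm{N} \times 0.3\ \mathrm{m} = 60\ \mathrm{N\,m}$$
    $$\tau_{\text{cheater}} \approx 200\ \mathrm{N} \times 1.3\ \mathrm{m} = 260\ \mathrm{N\,m}$$

    so the same hand force delivers more than four times the torque – before counting the far larger forces the legs and back can apply. If the spanner and bolt are rated anywhere near the unextended figure, the cheater bar takes them well outside their design envelope.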
    Imagine if that failure had occurred while he was lifting with his back and he suddenly shot upright due to the loss of resistance. There is potential for significant musculoskeletal damage, not to mention what the cheater bar and spanner might strike under such uncontrolled force.
    The spanner may not fail the first 10 or even 100 times but it may eventually fail due to repeated exposures.
    Or imagine that, through the extra force applied in tensioning, the bolt/nut fails suddenly. There is potential for one or more parts to become projectiles, not unlike bullets, due to the release of the energy stored in the bolt shaft under tension.
    Or the bolt may be fatigued and fail later, when the structure is in operational use – with the potential for multiple bolts/nuts to fail at once if this process has been applied at all points.
    This type of adaptation has also been used in the transport industry, where a cheater bar has been used to tighten a chain dog (a load tie-down mechanism) – with fatal outcomes when either the dog or the chain fails and the suddenly loosened chain springs back (as a wire cable can) and strikes the operator or a bystander.
    So whilst I applaud the initiative of individuals looking to overcome barriers to productivity, there is still a need for “information, instruction, training and supervision” in the application of such initiative.

  5. Mike Edwards

    Hi guys,

    I can only hope that the people preparing the next plane I have to fly in have a reasonable appreciation of ‘the risks’ and the possible ‘bad’ consequences that could result if they get it wrong, Daniel. I hope this influences the amount of care and control the operators apply to doing their task.

    I realise it might be old hat, but attention to detail and a focus on closure make the difference, in the end, to the success or otherwise of the task in minimising the possibility of those unacceptable negative outcomes.

    Shortcuts are just that, and almost always create a heightened risk of unexpected or negative outcomes – otherwise they would become the accepted method for doing the task, not just the way it’s done when they think no-one is looking.

    I like to think that we all try to control the work process enough to reduce at least the really obvious, serious negative consequences. Otherwise perhaps we should just roll a die or toss a coin to decide what to do (cf. the Big Bang Theory episode where Sheldon decides to let the dice control his fate).

    I have observed plenty of work where the only muscle not effectively engaged seems to be the grey matter. Are we seeing individual innovation, or just laziness, poor planning and inadequate preparation? After all, we all know “Don’t worry. She’ll be right, mate” is still alive and well in most industry sectors I interface with.
    Mike

  6. Tanya Hewitt

    I was thinking, as I read this post and the many replies, that the local rationality principle is at work here – actions taken by front-line practitioners make sense to them at the time they take them (else they wouldn’t be performing the acts). If an act is not in line with what is expected, the reaction should not be to tell the worker what is expected; it should be to understand why the worker is making the choices he/she is.

    In Reason’s taxonomy of slips, lapses, mistakes and violations, it is the violations category that interests me most – workers often do know what the “right” way to do something is, but have good reason to engage in what they are doing. (Hollnagel’s Efficiency-Thoroughness Trade-Off, or ETTO, principle is often realised here.)

    In fact, in a study that I cannot find, undertaken by Hollnagel and Amalberti (Hollnagel, E. & Amalberti, R. (2001). The emperor’s new clothes: Or whatever happened to “human error”? In S. W. A. Dekker (Ed.), Proceedings of the 4th International Workshop on Human Error, Safety and Systems Development (pp. 1–18). Linköping, Sweden: Linköping University), air traffic controllers and human factors/psychology specialists observed a set of air traffic controllers and determined when the ATCs were involved in rule-breaking. Fascinatingly, when the observed ATCs were shown where their performance was not as it should have been, they were able to justify why they had engaged in the activity they did. They all had good reason to do what they did, and could explain it.

    I learned of the above from a paper that Sidney Dekker wrote a while back, called “Doctors are more dangerous than gun owners”, still available at http://www4.lu.se/o.o.i.s/6131
