What can health & safety professionals learn from a jet aircraft designer?
Looking out the window of my Bangkok hotel room at the building site over the road, I started thinking about the differences in health & safety standards between countries like Thailand and the ones I'm more familiar with - like Australia and New Zealand.
In particular, it got me thinking about risk & hazard management and what countries like Thailand can learn from the West.
But it’s not as straightforward as you might think. Simply concluding that ‘West is best’ totally misses the point.
In fact, some of the hazard & risk management systems I've seen in the West would do more harm than good in this environment.
Truth is, whilst there’s no denying that they’re thorough and substantial, in many cases they’re simply not practical. The problem is they’ve been overthought and over-engineered - even for the western environments they were designed for. In a less mature market like this one they’d be a complete disaster.
But why does such over-engineering happen in the first place?
I like to refer to it as our tendency to complexity.
Complexity is all too commonplace in our industry because many people crave it. They love the idea that, because complexity is difficult to comprehend, it must in some way be intelligent.
But deep down, we all know that even complex subjects are best understood through simple concepts – and therefore solved with simple solutions.
What I have seen over the years is that there exists an inverse relationship between the complexity of an organization’s risk management system and its usefulness. The best systems are practical systems – they are simple and easy-to-understand; whereas the worst systems are complex and doomed to fail.
The bottom line is that many risk management systems would benefit from a strong dose of the KISS principle.
Keep It Simple, Stupid (KISS)
KISS – short for Keep It Simple, Stupid – was coined around 1960 by Kelly Johnson, lead engineer at Lockheed's famed Skunk Works (Lockheed Martin didn't exist until decades later).
The principle is best explained by the story of Johnson challenging a team of design engineers that the aircraft they were designing had to be not only fit for purpose, but also fit for its environment and its people.
The key message he was trying to get across was that there was no point in creating an aircraft so complex that it couldn't be understood and fixed by an average mechanic working under combat conditions with only minimal tools.
Hence, the 'stupid' term refers to the relationship between what happens in reality and the sophistication available to understand it.
And that’s the issue with many risk management systems – they don’t align the way things break with the (lack of) sophistication available to fix them.
There’s just no point creating something that is so complex that it can’t be understood and used by an average individual working in their typical environment.
Fundamentally, hazard & risk management needs to protect the workers. That means it needs to be designed to work in the real world, i.e. it needs to be immediately useful and practical.
Construction workers and labourers know the hazards, and they know those hazards need to be controlled. But they don't care about doing a 3-page analysis of each one. They already know what the controls are.
And for the workers over the road, what they really need is some better personal protective equipment (PPE) – not some abstract and overly-complicated system that adds zero value to those it’s designed to protect.
The Takeaway
There is a clear choice to be made when designing risk management systems. They can either be an academic exercise designed to be intellectually stimulating or they can be emphatically useful. But they can’t be both.
Which one are you going to choose?
Click here to find out more about Mango's Risk Management module.
Alternatively, contact us directly right here.