Why expert says mistake-proof systems need a cultural component
Most major accidents in the offshore energy industry are the result of human error, according to Ola Olamilehin, senior process safety engineer with Cenovus Energy. “Over 70% of the incidents were due to a lot of human errors, and that's not just one human error, but multiple of them at the same time…a compounded domino effect.”
Olamilehin gave a presentation at the Energy Safety Conference in Banff, Alberta about managing major accident hazards. He focused on the abnormal risks and conditions that have contributed to major accidents, meaning those causing the deaths of at least three workers.
“I would say the industry still has some ways to go,” says Olamilehin, when it comes to preparing for “those things that can happen and hurt or kill people.” He says abnormal conditions contained within the safety processes and systems will continue to show up, “but the most important thing is what do we do about them, and that’s cultural.”
Mistake-proof systems
Olamilehin spent much of his highly technical presentation on developing mistake-proof systems using the bow tie method and diagrams, which provide a visual depiction of the scenarios leading to a hazardous event. But what happens when a system believed to be mistake-proof leads to complacency?
That’s where safety critical element coordinators come in, according to Olamilehin, who says it is their role to make sure the system is operating as intended and to identify abnormalities. “It's no longer just managing the numbers, it’s no longer about having a great robust system, so I can relax now…it’s not just about the system alone.”
Cultural components
Complacency is cultural, and Olamilehin says companies must have a strong set of values at every level of the organization to prevent complacency from seeping into the culture. Otherwise, he says, a chain of mistakes can lead to a major accident. “It's understanding what is happening, versus just staying in that state of a limbo where you just relax and wait for things to happen before you begin to react.”
Vigilantly searching for abnormalities within a mistake-proof system, so they can be identified and acted on to prevent or mitigate a hazard, begins with culture, according to Olamilehin.
“The culture is very key. It can make or break the entire organization,” says Olamilehin, “so driving that operational discipline at the lowest level all the way to highest level is key.”
A company can have the most robust safety systems, but without a cultural component that encourages a mindful eye for abnormalities, the risk of a major accident will never be truly mitigated.