HSE&T Corner: Avoiding complacency through chronic unease can help ensure process safety
By Linda Hsieh, managing editor
Dr Ronald McLeod, global discipline lead for human factors at Shell, wants to help the drilling industry tap into “chronic unease.” Being chronically uneasy means never taking at face value what you see on instrument readings and control panels, or what you hear from pipes and other rig equipment. It makes you think twice about everything around you and can help direct focus to potential problems.
When applied to drilling and well operations, it can be effective at reducing fatalities and other incidents – including process safety incidents with catastrophic potential.
In essence, chronic unease is the opposite of complacency. “It’s constantly being wary,” Dr McLeod said in a presentation at the 2013 IADC Drilling HSE&T Europe Conference, 25-26 September in Amsterdam. Shell has been undertaking a significant effort around chronic unease based on the work of psychologist Daniel Kahneman. In his 2011 book “Thinking, Fast and Slow,” Dr Kahneman simplified how the brain works into two processes: System One and System Two.
“System One thinking is very fast and intuitive, whereas System Two is slow and deliberate. System One is always on. It doesn’t need any effort, whereas System Two you have to actually apply it because it’s lazy. You have to go to the effort of thinking,” Dr McLeod said.
The goal around chronic unease, therefore, is to ensure critical judgments and decisions are not overly biased by System One thinking. In a Shell-produced video shown during Dr McLeod’s presentation, the concept of chronic unease was presented as a key element of the company’s journey toward Goal Zero. Goal Zero is Shell’s belief that it can operate without fatalities or significant incidents.
“The fact that an incident hasn’t happened doesn’t mean that it won’t. Every incident-free day that passes might increase the chance of one happening. Why? Because we can get complacent,” the video stated. “We stop being uneasy. We stop looking for weak danger signals. We stop thinking, ‘What else can we do to ensure safety?’ We all do it – we all think ‘It won’t happen here, now, to me.’ But it can, and that’s why it’s so important to demonstrate your chronic unease.”
Shell’s chronic unease effort also addresses the concept of cognitive bias – thought processes that can convince you nothing is wrong. There are numerous types of cognitive biases – groupthink, confirmation bias, looking without seeing, etc. “They all lead us to more risky decisions,” Dr McLeod said.
He encouraged companies to think about cognitive bias within the context of drilling automation and supervisory control. The industry continues to increase automation levels across its well construction operations, but Dr McLeod warned that supervisory control can be a psychologically complex task for humans – something that should be considered as drilling becomes more automated. He cited an example from commercial aviation: in June 2009, Air France flight 447 crashed into the Atlantic en route from Rio de Janeiro to Paris, killing all 228 people onboard.
From a technical perspective, it was a failure in the automated flight system that led to the loss of reliable airspeed readings. “That alone, though, shouldn’t have caused the airplane to crash. Pilots were trained for this – they’d been through training months before on how to handle that situation,” he said.
The pilots were unexpectedly handed manual control, but they didn’t carry out the emergency drill for which they had been trained. “From the moment they lost air speed until they hit the water, they had four minutes, 23 seconds… They appeared to be unaware of what flight mode the aircraft was in,” he said.
Dr McLeod stressed that industry must recognize and address the challenges associated with supervisory control. “It seems to me, the more automation we add, the more significant cognitive complexity we’re adding for the people that are left. They’ve got to maintain real-time awareness of what state the automation is in – what it’s doing and what state the process is in – in case they have to take control. They have to maintain the skills they need to be able to intervene.”
Questions should be asked as industry moves toward further automation: Are we aware of the cognitive complexity of all we’re doing as we increase automation? Do we adequately train people to detect and diagnose critical abnormalities under real-life operational stress?
Design of automated drilling systems should aim to make it easy for the operators to know what their equipment is doing, to identify when it is not working as expected and to quickly and safely take control in the event of a failure. “I think there is a need for greater awareness within the industry of the cognitive demands of the stuff that’s being introduced,” Dr McLeod said.
This article is based on a presentation at the 2013 IADC Drilling HSE&T Europe Conference & Exhibition, 25-26 September, Amsterdam.