
Training for abnormal operations can help reduce human errors that contribute to safety events

To support procedural discipline on the frontline, organizations must look to situational awareness and performance-influencing factors

Speaking at the IADC Well Control Conference of the Americas on 19 August, ModuSpec’s Director of Technology Innovation Michael Fry outlined several strategies that can help organizations counteract the risk of human error.

By Jessica Whiteside, Contributor

“Why do you refuse to follow procedures?”

This provocative question is one that Michael Fry sometimes poses when conducting training sessions with drilling organizations on topics such as procedural discipline, leadership and quality assurance.

The typical response – and natural reaction, he said – is, “How dare you say that! We all follow procedures.”

Mr Fry, the Director of Technology Innovation for ModuSpec, doesn’t buy that.

“We say we do, and we might to some level, but do we truly follow procedures 100% of the time? The answer is no,” he said during a presentation at the 2025 IADC Well Control Conference of the Americas in New Orleans, La., on 19 August.

He cited a review of well control incidents by the International Association of Oil & Gas Producers, which found that human factors contributed to almost three-quarters of the incidents studied.

Mr Fry challenged drilling contractors to get down to a granular level during incident investigations to determine what caused someone to deviate from procedure – whether it relates to bias, psychological factors, competency, complacency or some other factor. The answer can lead to an “awakening” that helps people understand why they don’t always follow procedures, he said.

“You can be extremely safe and still make a mistake,” he said. “You can have all of the policies and procedures and JSAs (job safety analyses) and risk assessments, and then somebody loses situational awareness, and something happens.”

Support the isolated frontline worker

Frontline workers are often “isolated on an island” in situations where they must make decisions by themselves, whether as individuals or teams, Mr Fry said.

“We like to think that we’re there looking over their shoulder, giving them enough guidance and focus on what they need to do, but then there are things like performance-influencing factors that set in and cause us to think differently, to act differently, to go against our policies and procedures. So how do we get this individual – who’s isolated on an island trying to do his best each and every day – to eliminate human error?”

He outlined four interconnected strategies for counteracting the risk of human error: procedural discipline, training for abnormal operations, situational awareness training and eliminating performance-influencing factors.

To support procedural discipline, Mr Fry advises organizations to identify which steps in a procedure have the highest probability of human error, which errors are most likely to occur during those steps, and what can be done to prevent those errors. “But then the question I ask after that is: If we can do this for every procedure and every policy and every operation, why do we still have human error?” he said. “We know what we’re supposed to do. We know why we’re supposed to do it. Yet when that moment happens, we still have human error.”

Growing reliance on technology and less experienced teams may contribute to some of this disconnect. Understanding what to look for and how to react when something happens comes with experience. However, as technologies advance, that understanding among users starts to drop off because we’re relying on technology to tell us if there is a problem or what to do next, Mr Fry said. “But what if it doesn’t react in time?”

He suggested the industry pay more attention to training for abnormal operations to ensure that newer personnel recognize the first indications that something may be going wrong and understand when they can take corrective action or when they truly need to stop the job.

With frontline work often being performed by third-party hands, trainees or junior staff, do those individuals understand the difference between normal and non-normal? “A lot of times the answer is no,” Mr Fry said. “From a risk perspective, passing down operational responsibilities to someone who’s newer – who might not understand what’s normal or not normal – is where human error starts to take place.”

Situational awareness

Adherence to procedure may be stymied by lapses in situational awareness, even where risks and preventive measures have been identified. The feedback Mr Fry has received from crews is that something happens to take their mind off what they’re doing. They lose focus, so they may put a part in backwards or forget to install a seal, even when it’s a task they have done repeatedly in the past.

Mr Fry cited cognitive engineering authority Mica Endsley’s three-level model for situational awareness: (1) perception of elements in the current situation (e.g., observing and gathering data from the work environment); (2) comprehension of the current situation (e.g., connecting the dots from those observations to understand what they mean); and (3) projection of future status (e.g., predicting what might happen next based on your understanding of the situation).

Applied to a well control context, does the crew member perceive the significance of the data they see on their displays? If the data indicates an influx problem, do they understand the procedures they should undertake to address the issue, and can they predict the outcome of certain actions or inaction?

Mr Fry said he thinks the industry does not do a good enough job of training crews about these levels of situational awareness. Training should encompass what it is you’re looking for, why you’re looking for it, and what happens if you don’t do anything.

“If I’m drilling a formation, I expect the following things to happen – but what else could happen? What could be an indication that something’s started to go wrong?”

Performance-influencing factors

Lapses in situational awareness can be caused by performance-influencing factors, described by Mr Fry as something that occurs to change your thought process, how you go about your task, your reactions or your stress level. Fatigue can be an influencing factor, along with communication gaps, leadership decisions and a broad range of other factors. Among the biggest are time and money pressures, Mr Fry said, adding that these factors could be minimized if frontline workers are properly trained and supported.

He described a scenario in which a driller delays closing the BOP during a potential influx, because the last time they followed procedure to do so as a precaution, they were chastised for incurring the associated costs. Another scenario sees a client balking at the intended duration of scheduled downtime, leading the rig team to question whether the task is important and second-guess why they wanted to do it in the first place. In a further scenario, a subsea supervisor decides to use old parts to rebuild a BOP because there are no spare parts onboard after the rig manager was told to cut back on the budget.

“It’s part of our sins, the things that we do as corporations,” Mr Fry said. “What’s the first thing that gets cut? Training, personnel, and spares: all the things that influence operations.” DC
