Human Factors in the Oil and Gas Industry – Decision Making Part 1

Gareth Lock

This is the second in a series of articles written for Oil and Gas IQ highlighting the importance of Human Factors and Non-Technical Skills in the Oil and Gas sector. It focuses on the way in which decisions are made and why they are prone to error, especially in high-tempo, dynamic operations, or when situations are encountered which haven't previously been thought of or briefed.

Decision-making is something we do every day, ranging from simple situations where we don't even realise we have made a decision, to the work environment, where complicated plans are scrutinised and decisions on the best way forward are made after much deliberation.

However, whatever decisions we make, they are all heavily influenced by biases. These biases are almost hard-wired into our psyche or culture and are therefore invisible to our conscious mind. We cannot stop these instantaneous judgements being made, but we can recognise when they might be an issue, stop, pause and think it through, then make a considered decision and move on.

Take a common experience: you are in your car, approaching a junction at which you are about to turn. You don't consciously think about the following: looking in the mirror, signalling, applying the brake, changing gear (if you are driving a manual), turning the steering wheel, checking the exit path from the turn, noticing a pedestrian on the kerb so that you now pay attention to their movements, straightening the wheels, cancelling the turn signal, and slowly accelerating away, changing gear as you do so. Most of this is automatic and does not require conscious thought.

What if you knock an object from the worktop to the floor? Do you immediately try to catch it? Would you have the same reaction if it were a knife you had knocked off (but you didn't see what it was)? The natural reaction is to try to catch whatever you have knocked off, but if you haven't seen that it is a knife and reach out to grab it, you risk injuring yourself.

Daniel Kahneman described these decisions, and similar ones, as System 1 thinking in his book 'Thinking, Fast and Slow'. These decisions are intuitive, rapid and made without conscious thought, relying on your previous experience to provide an instantaneous response. Your brain does this because it relies heavily on pattern matching to reduce the time taken to live life. If we had to stop and think about everything we did, complicated tasks would take almost forever and we would not be able to live or operate at the pace we do.

Recently, New Scientist magazine published an article about pilots making incorrect decisions in a simulator when they had to execute an emergency drill that occurred at a different point in the simulated sortie from where it normally appears (e.g. an engine failure during the first take-off of the simulator session compared with take-offs later in the exercise). Such incorrect decision-making may have contributed to the TransAsia Airways crash in Taiwan on 4 February 2015, in which the wrong engine was shut down; a large proportion of simulator sorties focus on the 'critical engine' failing rather than spreading the failures evenly. Pattern matching can have very negative consequences if the information we have about the environment is not correct.

Such issues can exist in the Oil and Gas domain too. What about when controls are non-intuitive, e.g. a manifold choke control labelled 'Left Choke' on the right-hand handle and 'Right Choke' on the left? In an emergency, what is the operator's most likely response: reading the label, or intuitively reaching for the left- or right-hand control?

Or consider a crane operator who has moved containers from one side of the deck to the other tens of times; the load has always fitted, so they don't check that the one they are lifting now will fit. On this particular occasion they don't notice that there is something else on the deck, and the banksman hasn't told them either. They probably won't find out until the load is close to the landing point and the task is delayed. There is an assumption that things don't change, and that if they do, we'll notice. Unfortunately, this is a flawed assumption.

Humans are very good at this process of pattern matching, and it has allowed us to save countless hours on tasks and to undertake activities in highly dynamic environments. We constantly trade efficiency against thoroughness, depending on the situation and how much risk is involved. This includes the crew operating on the deck or drill floor 'taking shortcuts'. (I'll cover the implications of shortcuts in a later article, but suffice to say that although there are obvious potential issues, shortcuts aren't all bad as long as they are recognised and managed.)

Consider the following text, many versions of which are found across the internet.

It is azmanig waht the hmaun biran is clbpaae of wtih olny letiimd dtaa. As lnog as the frsit and lsat ltetesr are crorcet the oderr of the ohetr lteerts is ilevrnaret for untsnadireng and rediang the txet; the biran can awyals regconsie the wdros by unisg ptatren rgenotiiocn anolg wtih coxtent.

It isn't really amazing; it is just pattern matching. As we develop in life, we stop reading every word and start to read the context of what is written, filling in the gaps. Small children have difficulty completing this task successfully because they don't have the memories with which to compare the context and the text. Surprisingly, the majority of non-native English speakers whom I have trained can still read this message when I show it to them.

However, such pattern matching can lead to errors when the pattern is not quite what we expect, or when we get distracted. When you are reading drilling instructions, do you read every word and every number? How much do you skim-read? What about loading instructions for support vessels? Can you remember a time when someone else picked up something you had missed in written instructions?

The majority of the decisions we make are instantaneous and are made with reference to previous experience. We use these memories to fill the gaps in our immediate awareness, and most of the time we get it right. However, when there are critical decisions to be made, or the task is novel, it is worth pausing, thinking through what the task is and how it is made up, and, most importantly, what the implications of a wrong decision would be. We cannot rid ourselves of System 1 thinking, but we can use tools such as checklists, procedures or structured briefings to reduce the likelihood of a major incident.

The next article will also be about decision-making, but will focus on the different types of decisions we make, e.g. knowledge-based decisions or recognition-primed decision-making, and why we need to understand which sort we are making, as they have different error rates, with an obvious impact on safety and performance.

About the Author

Gareth Lock is a retired senior officer in the RAF where he operated, instructed and supervised on tactical and strategic missions in the C-130K Hercules.

During his time in the RAF he started a PhD examining the role of Human Factors in SCUBA diving, a domain with technical, supervisory, organisational and cultural issues at its core. Since October 2014 Gareth has been delivering Well Operations Crew Resource Management (WOCRM) training and coaching in the Middle East for a major client, working through Critical Team Performance, a bespoke consultancy. Critical Team Performance was contracted by the IOGP Wells Expert Committee's Human Factors Task Force to prepare an industry-changing recommended practice document setting a standard for the implementation of Crew Resource Management (also known as Non-Technical Skills) in Well Operations Teams.

This programme has identified a number of significant challenges in delivering Western-based training into non-Western cultures, and it has required considerable skill to get to the core of the barriers to improved performance and, consequently, improved safety.

Read more of Gareth's Oil & Gas IQ articles here