HUMAN FACTOR SAFETY LESSONS FROM POLICING APPLICABLE TO THE MARITIME INDUSTRY

May 25, 2015

ATTRIBUTION: THIS PAPER WAS FIRST PRESENTED AT THE MASTER MARINERS CONFERENCE IN LAUNCESTON, APRIL 15, 2015

John Walker, BEd, MA, CMC, MNI, Leadr
John Walker began his career as a Wireless Officer with Alfred Holt and Company, sailing on both Blue Funnel and Glen Line ships in the 1960s. He entered the University of Calgary as a mature student, graduating with a BEd, and later gained an MA from the University of Sussex. He is an Associate member of the CMMA and a Companion member of the Nautical Institute.

He spent 10 years under contract to the Royal Canadian Mounted Police researching, designing and delivering a wide range of human factor operational programs. These included stress and trauma management, hostage negotiation and high level crisis intervention. He also has a unique breadth of behavioural science experience combined with research and learning design across many industries, including maritime.

INTRODUCTION

The main objective of this paper is to show how safety-critical practices learned from police operational safety training, practices which over the past decade have significantly reduced injuries and fatalities, can be applied to the maritime industry. The paper focuses on three practical mental conditioning methods reinforced in police officer operational safety training. These are:

1. The awareness and avoidance of ‘The Ten Fatal Errors’,
2. The cognizance of personal ‘Myths and False Belief Constructs’ underlying operational safety, and
3. One specific situational awareness management methodology called “The Awareness Spectrum”.

Safety-critical human factor lessons from policing have a direct application to maritime operations. Within a mental conditioning context, the paper describes three fatal policing situations, each followed by an analysis of a serious maritime incident in which fatalities occurred or could have occurred. These are:

• The Ten Fatal Errors: The loss of the Herald of Free Enterprise
• Myths and False Belief Constructs: The grounding of HMS Nottingham
• The Awareness Spectrum: The sinking of Queen of the North

Managing stress (fight-or-flight) arousal and perception distortion through applied mental conditioning techniques during complex vessel operations is crucial. It is also safety critical that watch keepers are aware of potential ‘Fatal Errors’ and of the consequences of their own personal ‘Myths and False Belief Constructs’, and are able to apply ‘The Awareness Spectrum’ when appropriate.

1. THE 10 FATAL ERRORS

These tend to vary slightly between police jurisdictions. The original list was published after a spate of police fatalities in the 1970s and initiated a whole new paradigm in ‘officer safety’ training.

The fatal errors are identified in the following table:

Table 1: The 10 Fatal Errors
• Complacency, apathy
• Getting caught in a bad position
• Not perceiving danger signals
• Relaxing too soon
• False perceptions and assumptions
• Tombstone courage (the John Wayne syndrome)
• Fatigue and stress
• Not enough rest
• Poor attitude
• Equipment not maintained

1.1 Policing Example: False assumptions and conditioning under stress in a high-speed pursuit.

A fatal accident occurred when two police cars responded to an armed robbery. The officer driving a marked Highway Patrol unit collided with an unmarked Investigation unit. He tragically assumed that when the unmarked police unit in front pulled over to the side of the road, it was to allow him to pass. This would have occurred hundreds of times in normal traffic situations. The dominant ‘Myth’ construct would likely have been ‘vehicles in front always pull over when police emergency lights/siren are activated’.

Updated information about the location of the armed robbery had been received from the radio room. The unmarked Investigation unit in front was pulling over to execute a 360-degree turn. Its driver assumed that the officer driving the marked Highway Patrol unit behind had received the same radio message. Tragically, they were on different radio channels.

1.2 Maritime Example: False assumptions, not enough rest, poor attitude, relaxing too soon, and complacency as human factors in the capsizing of the Herald of Free Enterprise, resulting in 193 deaths.
Most of her crew were on the return leg of the second Dover–Calais round trip of their 24-hour shift, suggesting that fatigue was a human factor in the accident. There were also design factors that made it necessary to flood the bow ballast tanks, lowering the bow by three feet.
It was assumed that before the vessel left Zeebrugge the Assistant Bosun would close the forward car deck doors, as this was his responsibility; he was still asleep when the harbour-stations call sounded and the moorings were dropped. The Chief Officer was required to stay on deck to make sure the doors were closed, but after seeing a crew member in an orange coverall on the car deck he assumed they had been. Compounding this, the Bosun did not close the doors himself, as he did not see it as his duty at that time. The Chief Officer then left the deck and returned to the bridge, where he was relieved by the Master, who also now assumed the doors were closed and directed the Chief Officer to take his dinner break. In an astonishing lack of safety engineering there were no visual or electronic indicators on the bridge to show whether the doors were open or closed. 193 passengers and crew perished in the disaster.[1]

2. MYTHS AND FALSE BELIEF CONSTRUCTS
When surviving police officers were interviewed after their partners had been killed or injured, it became clear that belief constructs, or ‘personal myths’, were significant contributing human factors.
Table 2: Common police myths
• It’s only routine
• There are two of us
• Nothing ever happens on a Sunday
• They are only kids
• I am armed
• It’s only a traffic violation
• It couldn’t happen here
• I can handle it
• I have back up
• It’s never happened before
• Women don’t fight
• Stress doesn’t affect me

Research into police officer fatalities and serious injuries indicated that officers held certain beliefs, or ‘Myths’, which reduced their alertness or the appropriateness of their response to an emerging danger. Interviews with injured officers and with partners of police killed in the line of duty identified constructs such as “this could never happen here”, “I can handle this on my own” and “I have back-up; it is on its way”.
In the previous police example (1.1), the dominant ‘Myth’ expectation, combined with the stress and excitement of responding to an armed robbery call, would likely have been ‘It’s routine; vehicles in front always pull over when police emergency lights/siren are seen or heard.’
After many years of patrol experience, most police believe that they can categorize people and situations very accurately. Perspectives can become stereotyped, hardened and reinforced in their ‘Myths’ by commonly used language of the form ‘All …… are ……’ and ‘…… always/never ……’. This can expose them to significant perception mistakes encoded in their belief system constructs, which can have fatal consequences.

2.1 Policing Example: a fatal ‘Myth’ – women don’t rob banks

A police officer was responding to a bank alarm. It was 37 degrees in Los Angeles. He was observed shouting at a woman wearing a full-length leather coat in close proximity to the bank. When she did not respond to his warnings, he ran up to her shouting “please get into cover, there is an armed bank hold-up close by”. He was shot and killed by the woman, who had just robbed the bank; she had a sawn-off shotgun hidden under her coat. The officer’s likely ‘Myth’ construct was ‘women don’t rob banks’. He did not perceive the unusually heavy leather coat, worn on an extremely hot day, as a danger signal. Compounding this, his situational awareness would have been affected by the narrowed perception associated with the fight/flight reaction; tragically, he was responding for all the right reasons.[2]

2.2 Maritime Example: The Grounding of HMS Nottingham.

The Royal Navy destroyer HMS Nottingham was severely damaged when she grounded on Wolf Rock, east of Lord Howe Island, en route to New Zealand on 7 July 2002. A combination of potential ‘Fatal Errors’ likely included complacency, apathy, getting caught in a bad position, and not perceiving danger signals. Likely belief constructs could have included “It couldn’t happen to me”, “I can handle it” and “It’s only routine”.

She was stabilised by her crew, towed back to Australia for basic repairs, then taken back to the UK on a lift ship. The Nottingham was considered one of the most modern and capable ships in the Royal Navy at the time. The subsequent Inquiry found that:

“It would appear that the Navigating Officer’s quality and standard of work in pilotage planning were also far from adequate. … The quality of the Navigating Officer’s chart preparations and notebook, and his execution of the manoeuvre out of the anchorage belie a casual approach to his duties, and a lack of understanding of risk”. He had issued advice to the Officer of the Watch without any reference to the chart or knowledge of the ship’s position or the proximity of dangers; specifically, he inadvertently advised him to alter course directly towards Wolf Rock.[3]

These ‘Myths or False Beliefs’ are constructs that have evolved over time and, being unstated, are rarely challenged, setting the scene for fatal errors to occur. Lessons from policing show that the most common fatal errors include false assumptions, complacency, not perceiving danger signals, relaxing too soon, and unmanaged fatigue and stress. I speculate that the Navigating Officer in the grounding of HMS Nottingham may have had a strong sense of his own competence and held the myth that there were no risks to the vessel as observed from the bridge. Watch keepers need to challenge their own ‘Myths and False Beliefs’ before an incident occurs, not after.

3. MANAGING SITUATIONAL AWARENESS, THE AWARENESS SPECTRUM

An effective situational awareness and perception management system known as the “Awareness Spectrum” identifies five levels of awareness and perception control, originally articulated by Charles Remsburg in his book “The Tactical Edge: Surviving High-Risk Patrol”.[4]

The Awareness Spectrum is shown in Table 3 below.

Table 3: The Awareness Spectrum

White: Situationally unaware, daydreaming, unfocussed, mind in neutral

Yellow: Alert, observant but relaxed; scanning, observing, attentive to the situation; focus broad

Orange: Potential threat, volatility; increased alertness; focus narrowing on the threat area

Red: Imminent, life-threatening, high-risk danger; very narrow focus on the source of the danger: hands, knife, gun, vehicle

Black: Overwhelmed by fight/flight stress (panic, paralysis); visually overwhelmed; loss of focus and inability to make a decision

Mental conditioning became a key factor in police officer survival training to counteract the extreme fight/flight stress and perception errors that lead to loss of control and panic, described as ‘Condition Black’. ‘Condition White’ is completely inappropriate for any bridge or engineering watch keeper at any time. As a potential incident develops, such as an approaching vessel apparently steering out of control, the watch keeper’s levels of awareness, stress arousal and focus are adjusted appropriately, never entering ‘Condition Black’. Ideally the approaching vessel would be noticed in ‘Condition Yellow’ and, as it closed in, the focus would narrow to ‘Condition Orange’ and appropriate avoidance action would be taken. If a collision were imminent, it would be a ‘Condition Red’ situation, with actions taken and orders given for collision avoidance. Bridge communication and vessel control are managed at all times. By managing perception and stress through this process, decisions, preparations and orders are made in a calm and methodical way.

3.1 Policing Example: Lack of Situational Awareness

Condition White to Black: A police officer was writing out overdue parking tickets in a suburban supermarket lot. According to a witness, a clearly distressed and aggressive individual approached the officer and started shouting. The officer ignored the situation, turned his back to the source of the threat and continued to complete the ticket. Possibly in ‘Condition White’, and used to abuse, he likely held a myth construct that ‘this is mundane, boring work; nothing ever happens on a Sunday here’.

Suddenly the distressed man pulled out a knife and moved very quickly towards the officer, who turned around and panicked (fight/flight stress overload), dropped his ticket book and put his hands in the air. He did nothing to protect himself and was tragically stabbed to death. The witness reported that it all happened in seconds and that the officer appeared “blank/stunned” before being killed. Officer safety statistics show that a hostile person with an edged weapon can cover 11 feet in a second.[5]

3.2 Maritime Example: The sinking of the Queen of the North

At 8 pm on 21 March 2006 the BC Ferries vessel ‘Queen of the North’, a 37-year-old RO/RO ferry of 8,806 gross tonnes with a capacity of 700 passengers and 115 cars, departed Prince Rupert, British Columbia, on its regularly scheduled service to Port Hardy at the northern end of Vancouver Island. The normal crossing of the Inside Passage took 15–18 hours. The vessel failed to make a course correction and ran aground at 15.5 knots on Gil Island. Two passengers’ bodies were never recovered. Human error was found to be the central cause of the accident.
Three crew members were in charge of navigation and steering on the night of the sinking. The person at the wheel was the ship’s Quartermaster (QM1), a female deckhand who was a “rating under training”. The two people in charge of navigation were the Second Officer (2/O) and the Fourth Officer (4/O). The internal report concludes that the ship’s black box shows the 4/O failed to alter course, or at the very least to verify that such a change of course had been made. It also concludes that the two people on the bridge that night, the 4/O and QM1, lost situational awareness sometime after Sainty Point. The wind at this point was increasing, gusting in squalls to 30 knots, with reduced visibility on the starboard bow.
The following observations regarding the human element were made by the Transportation Safety Board of Canada (TSB), indicating that human error was the principal cause of the sinking.
The 2/O left the bridge but left behind a laptop computer that was playing music in the background. This was heard at Prince Rupert Traffic control when the 4/O communicated a course change in advance; crucially, this course correction was never made. A personal conversation was taking place between the male 4/O and the female QM1, who was at the helm. The 4/O also turned down the dimmer switch on the ECS monitor. They were now alone on the bridge and sat in their chairs, next to the radar and the forward steering station respectively, conversing intermittently for the next 12 minutes while music played in the background. The squall passed and visibility improved.

At about 0020, with the vessel now 13 minutes past the planned course-alteration point at Sainty Point, the 4/O moved between the bridge’s front window and the radar, and subsequently ordered a course change to 109°, which QM1 queried and he reaffirmed. As QM1 stood to make the change, she looked up and saw trees off the starboard bow. The 4/O also saw trees and moved to the aft steering station. As he did so, he ordered QM1 to switch from autopilot to hand-steering. QM1, however, was unfamiliar with the operation of the switch at the forward steering station and did not know how to comply.[6]

The Vancouver Sun newspaper reported that “Just before the crash, the 4/O screamed at the helmswoman to make a bold course correction–a 109-degree turn–and to switch off the autopilot. But the helmswoman responded that she did not know “where the switch was located.” The BC Ferries’ report questions the validity of this evidence “as the autopilot disengages simply with a single switch and would have been operated numerous times by the [helmswoman].” [7]

It is likely that at the time of the grounding on Gil Island both the 4/O and QM1 were in Awareness Spectrum ‘Condition White’, situationally unaware. They were chatting and listening to music, and the ECS screen was dimmed. The vessel had made headway 13 minutes past the planned, but never executed, course alteration at Sainty Point, as previously communicated to Prince Rupert Traffic. When QM1 saw the trees on Gil Island directly ahead and was subjected to the 4/O screaming at her, she went immediately into a ‘Condition Black’ high-stress fight/flight response, losing her capacity to locate and switch off the autopilot.

CONCLUSION
It is very clear that lessons from police training and experience have application in the maritime industry and many other high-risk occupations. Drawing on a decade of policing research and education, this paper identifies and describes human factor practices that would augment Bridge Resource Management (BRM) and other human element safety initiatives in maritime training.
In policing, particularly high-risk patrol, considerable officer safety risks are encountered on a daily basis, with significant consequences. Many police watches begin with a discussion of the ‘errors’ before officers go on shift; there are constant reminders. The maritime industry is more complex because of its variety of vessels, cargoes and voyages. One factor that is critical to ‘Fatal Errors’ and problematic personal ‘Myths and Belief Constructs’ is fatigue and stress, particularly when crewing numbers are reduced to the absolute minimum.
Vessel-specific understanding and communication about likely ‘Fatal Errors’ is not complex. Each ship could discuss and post its own list of ten or more, easily augmented by ship-wide visual reminders.
Consideration of the belief constructs or ‘Myths’ that individual watch keepers hold is more complex, as they are rarely observable. This crucial aspect needs to be widely communicated and discussed in both training and vessel operation. Watch keepers should be checking in now and then, asking “What are you thinking about right now?” or “Which Condition is appropriate now?” While, from my own seagoing experience, some of the responses may be hilarious, the underlying message is being reinforced. Communicating about and applying the ‘Awareness Spectrum’ is not particularly difficult and could be supported by appropriate displays on the bridge and in common areas.
Perhaps the critical question that should be asked when on watch, particularly during complex vessel movements when approaching ports at night, is: “What Awareness Spectrum condition are you in right now?”