



A GUIDE TO UNDERSTANDING HUMAN FACTORS & HUMAN BEHAVIOUR
In Safety Management & Accident Investigation

NAVYSAFE
LETHAL TO OUR ENEMIES - SAFE TO OURSELVES

FOREWORD
This guide has been prepared as part of the Navy Safety Improvement Programme
“NAVYSAFE” to help develop our understanding across the Naval Service of the
human factors and behaviours that contribute to accidents, incidents and near
misses.

It is intended as a ready reference document for all personnel in leadership positions, from AB and Marine all the way up the management chain. First and foremost it is designed to help prevent accidents but also to assist investigators (after an event) to identify the root causes. In doing so we will be able to uncover weaknesses in our environment, organisation and equipment design; take action; learn from our experiences and ensure that we remain lethal to our enemies and safe to ourselves.

In line with the First Sea Lord’s Safety Pledge, safety is everyone’s business and we
all have a role to play. So read and refer to this guide, use the examples to discuss
where you may have made errors or violations in the past and ensure we are
sensible about taking risk such that we remain an effective fighting force that is
risk aware, not risk averse.

Vice Admiral Philip Jones CB
Fleet Commander

A GUIDE TO UNDERSTANDING HUMAN
FACTORS AND HUMAN BEHAVIOUR
In Safety Management and Accident Investigation
By R S BRIDGER, P PISULA, A BENNETT

2012 British Crown Copyright

This guide has been produced by the staff of the Institute of Naval Medicine and is
provided to support procedures for safety management and accident investigation
as described by BRd 9147.

The aim of the guide is two-fold:

• to provide an understanding of Human Factors and Human Behaviour for all personnel, at all levels, to help identify risk and prevent accidents and incidents before they occur.

• to help accident investigators ask the right questions about accident causation.
This will enable them to better identify performance shaping factors in the
work environment and make recommendations for improvement.

“Human error is not random. It is systematically connected to features of people’s tools, tasks and operating environment... Human error is not the conclusion of an investigation. It is the starting point.”1

1 Dekker, S. 2006. The Field Guide to Understanding Human Error. Ashgate Publishing, 236 pp.

CONTENTS
This guide is structured in three parts:

Part 1 - An Introduction to Human Factors and Human Behaviour.

Part 2 - Classification of Human Factors within accidents: Errors and Violations.

Part 3 - Identifying Human Factors and Human Behaviours after an accident.

This guide is intended to supplement the training received by personnel during career courses and act as an ‘aide-memoire’ for routine planning and management of activity. To achieve this aim, real-life examples are provided which detail faults in equipment design, environmental hazards and organisational failings which have led to equipment damage and personal injury.

Note: Throughout this guide, accidents, incidents and near misses will be referred
to; they are collectively known as events and are defined as follows:

Accident: An undesired event resulting in death, ill health, injury, damage or other
loss.

Incident: An event that gives rise to an accident or had the potential to lead to an
accident. An incident where no death, ill health, injury, damage or other loss occurs
is also referred to as a “near miss”. The term “incident” includes “near miss”.

Near miss: An event that, while not causing harm, had the potential to
cause death, ill health, injury, damage or other loss but which was avoided by
circumstance or through timely intervention.
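
The three definitions above form a simple hierarchy: any event involving actual loss is an accident, while an event with no loss but the potential for it is a near miss (a type of incident). A minimal Python sketch of that hierarchy is shown below; the names EventClass and classify_event are invented for illustration and are not part of any Navy reporting system.

```python
from enum import Enum


class EventClass(Enum):
    ACCIDENT = "accident"    # loss actually occurred (death, ill health, injury, damage or other loss)
    NEAR_MISS = "near miss"  # no loss occurred, but the potential for loss was present
    INCIDENT = "incident"    # umbrella term used in this guide; it includes near misses


def classify_event(loss_occurred: bool, potential_for_loss: bool) -> EventClass:
    """Classify a reported event using the definitions given in this guide."""
    if loss_occurred:
        return EventClass.ACCIDENT
    if potential_for_loss:
        return EventClass.NEAR_MISS
    # No loss and no credible potential for loss: not a reportable event under these definitions.
    raise ValueError("not a reportable event")


# Example: a dropped load that narrowly missed a member of the ship's company.
print(classify_event(loss_occurred=False, potential_for_loss=True))  # EventClass.NEAR_MISS
```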

PART 1 - AN INTRODUCTION TO
HUMAN FACTORS AND HUMAN
BEHAVIOUR
The Scope of Human Factors
The diagram below depicts the general scope of Human Factors. It highlights that
an individual’s behaviour may be influenced by the environment (the physical
world), the organisation of their work and the design of machines, equipment,
software and workspaces.

Figure 1 - Where to Look For Human Factors (the Individual at the centre of the Environment, Organisation and Design domains).

Within these domains there can be a number of different areas that can affect
Human Behaviour.

Environment
The environmental domain focuses on environmental conditions such as noise,
lighting, temperature and humidity. The presence of environmental hazards such as
high sea states, chemicals and radiation are also included in this domain.

Organisation
The organisation domain focuses on the way operations and tasks are organised.
This includes examples such as: how units are manned; what instructions are given
to personnel; what levels of training are considered necessary to complete a task;
the type of watch system used; the level of supervision required for a task and the
creation and application of Standard Operating Procedures (SOPs).

Design
The design domain focuses on workplace ergonomics and the design of equipment,
user interfaces and software such as the location of a valve operating point, the
arrangement of an operating console or the layout of a computer screen (graphical
user interface).

Personnel involved in Safety Management should consider the effect that environmental, organisational and design factors may have on an individual performing a task. Accident Investigators should focus not only on what personnel did, but also on the situation personnel were in at the time the accident happened. Investigators should ask what environmental, organisational and design factors played a role, and how these affected the personnel involved.

Note:
Appendix A gives practical examples of environmental, design and organisational
factors that can influence behaviour and lead to errors and accidents.
Appendix B gives some examples of Human Factors in real accidents.

Human Behaviour
The contemporary view of human behaviour is that human error is not the cause of failure - rather, it is an effect or symptom of deeper trouble. After an accident has occurred, the focus must therefore extend beyond ‘…what occurred?’ to ‘…why did it occur?’ This particularly applies to accident investigation, where priority must be placed on understanding why errors occurred or why personnel behaved in an unexpected manner.

The Accident Chain

Accidents are usually the end point of a series of events in which the situation becomes increasingly unsafe. Organisations erect multiple barriers to prevent accidents and maximise safety, but none are perfect. By looking beyond the immediate cause, back from the time the accident occurred and outwards to the wider context, accident investigators can often identify weaknesses at the organisational level from which useful lessons can be learned. Figure 2 illustrates how events can unfold in the form of an accident trajectory (known as the “Swiss Cheese” model).

In almost all accidents, personnel are a key part in this accident chain. Human
behaviour is never constant but the actions of individuals can often be attributed
to the environment in which they are placed. An understanding of these Human
Factors within design and organisation and a dynamic assessment of them against
the current environment can greatly assist in the prevention of accidents.
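
Reason’s model can be pictured as a stack of imperfect barriers: an accident occurs only when the ‘holes’ in every layer line up along the trajectory. The sketch below is purely illustrative; the barrier names and the idea of modelling each hole as a failed check are assumptions made for the example, not part of the model’s formal definition.

```python
from typing import Callable, Dict

# Each barrier is a check that returns True if it stops the unsafe trajectory.
Barriers = Dict[str, Callable[[dict], bool]]

barriers: Barriers = {
    "board policy":      lambda event: event.get("policy_adequate", True),
    "line management":   lambda event: event.get("supervision_present", True),
    "local conditions":  lambda event: not event.get("latent_failure", False),
    "unsafe act caught": lambda event: not event.get("unsafe_act", False),
    "last-line defence": lambda event: event.get("alarm_raised", True),
}


def accident_occurs(event: dict, defences: Barriers) -> bool:
    """An accident results only when every barrier fails - the holes line up."""
    failed = [name for name, check in defences.items() if not check(event)]
    print("failed barriers:", failed if failed else "none")
    return len(failed) == len(defences)


# Example trajectory in which every defence is breached.
event = {"policy_adequate": False, "supervision_present": False,
         "latent_failure": True, "unsafe_act": True, "alarm_raised": False}
print("accident:", accident_occurs(event, barriers))
```

The point of the sketch is that removing any single hole (returning one check to True) is enough to block this particular trajectory, which is why investigators look for weaknesses at every layer rather than only at the final unsafe act.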

Figure 2 - ‘Swiss Cheese’ Model of Accidents. The accident trajectory occurs when holes in successive layers of defence (fallible board decisions and policy, line management problems, local failures, unsafe acts and inadequate defences) line up. (Redrawn from Reason, J. 1990. Human Error. Cambridge University Press, Cambridge, UK.)

Note:
Appendix C presents case studies to assist with the understanding of Figure 2
and its implications for accident investigation. These case studies illustrate how
organisational policies and practices can make it possible for latent failures to exist
in the workplace causing accidents to occur when the remaining safety systems fail.

Safety Culture

As well as explaining the behaviour of individuals in the context of Human Factors, the shaping of events should also be considered against the influence of the local and organisational Safety Culture.

Safety Culture is the term used to describe the shared attitudes and beliefs
about safety and safety related activity within an organisation.

For example, whether personnel perceive that they work in a hazardous occupation;
whether they feel confident to speak out when things don’t seem right; whether
safety is rewarded and recognised or whether the focus is just on getting the job
done at any cost. Often, hazards linger in the workplace and eventually cause an
accident because they were not recognised or reported at an earlier time.

To fully understand the Accident Chain, the influence of Safety Culture at all stages,
before and after an event, should be carefully examined.

PART 2 - CLASSIFICATION OF HUMAN
FACTORS WITHIN ACCIDENTS:
ERRORS AND VIOLATIONS
Even when the background presented by these Human Factors is recognised, the actions of personnel still require explanation. Basic behaviours in the context of accidents can be defined as Errors or Violations, as described below:

ERRORS (Unintentional Action)

Errors in Action – associated with familiar tasks that may not require much attention. These skill-based errors can occur if attention is diverted from the task and are often a sign of fatigue or of distraction by overload.

• Slip (Commission) – carrying out an incorrect action or task: for example, entering the wrong heading into the autopilot; deleting instead of saving a file; taking a reading from the wrong instrument.

• Lapse (Omission) – failure to carry out an action or task when action was required: for example, failing to check the condition of ropes used for towing; failure to check that all were seated safely in a RIB before moving; failure to turn off the electrical power supply before undertaking repairs to a piece of equipment; missing a crucial step in a safety-critical procedure.

• Psychomotor – accidentally operating a control or changing the state of a component through clumsiness: for example, knocking oneself unconscious while handling awkward loads in a confined space with limited headroom; man overboard due to loss of balance in high seas.

Errors in Thinking – involve mental processing linked to planning, gathering information and communication.

• Rule-based error - Successful task performance often requires that personnel follow simple rules of the ‘if X, then Y’ variety. Errors can occur when personnel do not know the rule, when the situation changes and the usual rule no longer applies, or when personnel do not receive the information they need to act on the basis of the rule. For example: an individual is very familiar with filling a tank; it usually takes 30 minutes. However, the individual does not know that the size of the inlet pipe has been enlarged and the tank now fills more quickly. After 15 minutes the gauge indicates that the tank is full, but the individual ignores it, thinking it is faulty. The tank overflows. The individual is applying a rule, but the context has changed and the rule no longer applies.

• Knowledge-based errors - Mistakes and poor judgement are examples of knowledge-based errors. In many situations we may have all the necessary knowledge to deal with a problem, but we fail to use that knowledge correctly. Fatigue, time pressure, a lack of communication and many other human factors may cause personnel to act before they have all the facts needed to make the right decision. For example: a man is injured when removing the lid of a drum using a burning torch. He had not been told that the drum contained flammable liquid, which exploded when the torch was applied.

VIOLATIONS (Intentional Action)

A violation is a conscious action by an individual which did not conform to policy instructions or standard procedures. It involves a deliberate deviation from the rules, and in some cases this non-compliance can become the ‘norm’. There may be several explanations for violations:

Routine violations - not following the rules/procedures in a usual operating environment.

• Situational Violation: rules could not be followed due to situation-specific factors, e.g. excessive time pressure, unsuitable tools to complete the task, SOPs that do not relate to the task at hand.
• Violation for Organisational Gain: deliberately ignoring the rules while trying to support the organisational objectives, e.g. ignoring safety procedures in order to sail on time.
• Violation for Personal Gain: deliberately ignoring the rules to save personal effort, e.g. to finish early or to ‘show off’.
• Recklessness: ignoring risks and the potential consequences for themselves and others, e.g. a RIB being driven unsafely in a high sea state for enjoyment.
• Sabotage: occurs when there is intent for both the action and the consequence, e.g. malicious damage.

Exceptional violations - not following the rules in unforeseen or highly unusual circumstances. Often this occurs when something has gone wrong. To solve a new problem you break a rule, even though you know you are taking a risk.

The procedure for how to identify each of these errors or violations when investigating accidents can be found in Part 3 of this guide.
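
Because the classification above is a small, fixed taxonomy, it can be written down directly. The sketch below encodes it as Python enums purely as an aide-memoire for the categories defined in this Part; the class names are invented for illustration and carry no official status.

```python
from enum import Enum


class ErrorType(Enum):
    # Errors in Action (skill-based, unintentional)
    SLIP = "slip (commission) - an incorrect action or task was carried out"
    LAPSE = "lapse (omission) - a required action or task was not carried out"
    PSYCHOMOTOR = "psychomotor - a control was operated accidentally through clumsiness"
    # Errors in Thinking
    RULE_BASED = "rule-based - the usual 'if X, then Y' rule no longer applied"
    KNOWLEDGE_BASED = "knowledge-based - a mistake or poor judgement"


class ViolationType(Enum):
    ROUTINE = "routine - non-compliance in a usual operating environment has become the norm"
    SITUATIONAL = "situational - the rules could not be followed in the circumstances"
    ORGANISATIONAL_GAIN = "rules deliberately ignored to support organisational objectives"
    PERSONAL_GAIN = "rules deliberately ignored to save personal effort"
    RECKLESSNESS = "risks and potential consequences ignored"
    SABOTAGE = "intent for both the action and the consequence"
    EXCEPTIONAL = "exceptional - a rule broken in unforeseen or highly unusual circumstances"


print(ErrorType.SLIP.value)
```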

PART 3 - IDENTIFYING HUMAN
FACTORS AND HUMAN BEHAVIOURS
AFTER AN ACCIDENT
BRd 9147 and BRd 172 (the Yellow guide) provide guidance for conducting safety,
health and environment accident/incident investigations.

When seeking to understand the contribution of Human Factors to an accident, this guide proposes a two-stage approach:

• Stage 1 - Identification and Classification of Human Behaviour
  1. Did an error or violation contribute to the accident?
  2. Did Human Factors increase the risk of the error/violation occurring?
  3. Why did these Human Factors exist in the first place?

• Stage 2 - Consideration of how to prevent recurrence
  1. If Human Factors contributed to the accident, what can we do to remove them to prevent recurrence?
  2. How can we shape future Human Behaviour to prevent recurrence?

STAGE ONE: IDENTIFICATION AND CLASSIFICATION OF
HUMAN ERROR AND HUMAN BEHAVIOUR
Stage one aims to identify and classify errors/violations, human factors and root
causes. This involves answering three questions using Figures 4 to 6 as a guide and
should be done in the steps shown below:

Q1. Was there an error or a violation?
The answer to this question will provide a broad categorisation of behaviour: ERROR OR VIOLATION. Use Figure 4 to answer this question.

Q2. What Human Factors contributed to the error/violation occurring?
The answer to this question will require a micro-analysis of the accident against the context of the immediate scene and the sequence of events: HUMAN FACTORS. Use Figure 5 to answer this question.

Q3. Why did these Human Factors exist in the first place?
The answer to this question will also provide numerous secondary questions and a macro-analysis of the accident against the context of the wider organisation and potentially latent issues: ROOT CAUSES. Use Figure 6 to answer this question, then move to Stage 2.

Figure 3 - Identification and Classification of Human Error and Behaviour.
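
The three questions in Figure 3 yield a simple record for each event: a behaviour classification, the contributing Human Factors by domain, and the root causes behind them. The sketch below shows one way such a record might be held; the dataclass and field names are invented for illustration and mirror Figure 3 rather than any mandated reporting format.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class StageOneAnalysis:
    event: str
    behaviour: str                                # Q1: error or violation type (see Figure 4)
    human_factors: Dict[str, List[str]] = field(  # Q2: contributing factors by domain (see Figure 5)
        default_factory=lambda: {"ENVIRONMENT": [], "ORGANISATION": [], "DESIGN": []})
    root_causes: List[str] = field(default_factory=list)  # Q3: why the factors existed (see Figure 6)


# Worked example based on the HMS ENDURANCE case study in Appendix C.
analysis = StageOneAnalysis(
    event="Valve opens unexpectedly and major flood occurs",
    behaviour="Error in Action - slip (incorrect re-installation of air lines)",
)
analysis.human_factors["ORGANISATION"] += ["Insufficient skill/experience", "Time pressure to re-install lines"]
analysis.human_factors["DESIGN"].append("Design flaw")
analysis.root_causes += ["Lack of ownership of cumulative risk", "Poor safety culture"]
print(analysis.behaviour, "->", analysis.root_causes)
```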

Q1. WAS THERE AN ERROR OR A VIOLATION?
To answer this question the following flow chart should be considered (definitions
for these can be found in Part 2 of this guide):

Did an error or violation occur?
• No - was there a system failure? If yes, review procurement and maintenance procedures; if no, the event was an unforeseeable occurrence (a true accident).
• Yes - classify the behaviour as an error or a violation:

ERROR - the action was unintentional.
• Error in Action: Slip (Commission); Lapse (Omission); Psychomotor.
• Error in Thinking: Rule-based; Knowledge-based.

VIOLATION - the action was intentional.
• Routine: situational rule breaking; Violation for Organisational Gain; Violation for Personal Gain; Recklessness; Sabotage.
• Situational.
• Exceptional.

Figure 4 - Classification of errors and violations

Typical questions for this process are (not exhaustive):

• What tasks were taking place at the time of the accident/incident?
• What information did the person have at the point of occurrence of the accident/incident?
• Could personnel adhere to procedures?
• Were the conditions outside normal practice, i.e. did the personnel find themselves in an environment that differed from the normal operating environment?
• Was there anything different from normal that day?

Q2. WHAT HUMAN FACTORS CONTRIBUTED TO THE ERROR
/ VIOLATION OCCURRING?
To answer this question the following flow chart should be considered:

Q1. ERROR/VIOLATION OCCURRED - consider whether the following Human Factors were present:

ENVIRONMENT: extremes of heat/cold; excessive noise; confined space; high sea state; poor lighting; toxic hazards; flammable materials; weather conditions; other.

ORGANISATION: fatigue; watch systems; high time on task; poor team work; communication problems; inadequate maintenance; poor record keeping; conflicting goals; poor instructions; time pressure; lack of supervision; lack of training; no SOPs in place; other.

DESIGN: workstation layout; too many controls; poor displays; console design; presence/absence of warning signs; screen layout; system response time; adequacy of feedback; poor sightlines; visibility; number of warnings/alarms; other.

Figure 5 - Flowchart to identify what risk factors were present
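
Figure 5 is essentially a prompt list, so it translates naturally into a simple lookup that an investigator (or a reporting tool) could walk through domain by domain. The dictionary below is only a sketch of that idea; the factor wording follows Figure 5, but the structure and function name are assumptions made for illustration.

```python
# Subset of the Figure 5 prompts, keyed by domain (the full lists appear in the figure above).
HUMAN_FACTOR_CHECKLIST = {
    "ENVIRONMENT": ["Extremes of heat/cold", "Excessive noise", "High sea state", "Poor lighting"],
    "ORGANISATION": ["Fatigue", "Time pressure", "Lack of supervision", "Lack of training", "No SOPs in place"],
    "DESIGN": ["Workstation layout", "Too many controls", "Adequacy of feedback", "Poor sightlines"],
}


def prompt_factors(domain: str) -> list:
    """Return the checklist questions for one domain (ENVIRONMENT, ORGANISATION or DESIGN)."""
    return [f"Was '{factor}' present at the time of the event?"
            for factor in HUMAN_FACTOR_CHECKLIST[domain]]


for question in prompt_factors("DESIGN"):
    print(question)
```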

Typical questions for this process might be (not exhaustive):
• What were the environmental conditions?
• Were the levels of lighting and noise appropriate for the task?
• Was the environmental temperature appropriate?
• Did the weather conditions create a different environment from normal?
• Is there evidence of sleep deprivation or fatigue?
• Time of day – was the operator starting or finishing a watch?
• How many watches did the operator do previously?
• When was the operator’s last rest day?
• Had those involved had adequate food and hydration in the time prior to
the accident/incident?

• Is there evidence of poor teamwork or communication?
• Were clear instructions given?
• Was the goal of the task clearly explained?
• Was there adequate support or supervision present?
• Was adequate time available or was there conflict with other tasks?
• Was the workspace configured appropriately / normally at the time of the
accident?
• If hazards were present, were they properly identified and understood?
• Could the operator see all the necessary controls and displays while carrying out
the task?
• Were the controls designed to support ease of use and transparency of
operation?
• Was the operator able to reach and operate all the necessary controls?
• Could they be operated comfortably for the duration of the task?
• Was communications equipment adequate?
• Were any warning lights/audible warnings present at the time?
• Had anything changed in the workplace recently? E.g. layout of workspace,
introduction of new equipment?

Q3. WHY DID THESE HUMAN FACTORS EXIST IN THE FIRST PLACE?
Many of the Human Factors identified in Figure 5 may have root causes – it is important to consider why these Human Factors existed. Figure 6 (below) gives examples of some Human Factors shown in Figure 5 and the root causes that may explain their presence in the workplace at the time the accident took place. In some cases more than one Human Factor may have a single root cause (for example, poor safety culture).

If Human Factors (in Environment, Organisation or Design) were present, consider the following root cause analysis (not exhaustive):

Figure 6 - Analysis of Human Factors and their root causes. Examples of Human Factors present (Q2): extreme heat (ENVIRONMENT); inadequate maintenance, lack of training, no SOPs in place/no clear safety procedures/safety rules not enforced, poor teamwork/safety rules ignored/poor record keeping, and fatigue/high time on task/excessive time pressure (ORGANISATION); absence of warnings, equipment malfunction/adequacy of feedback, and too many controls/controls that contradict each other (DESIGN). Examples of root causes (Q3): inadequate maintenance procedures; poor leadership or management; poor communication of safety management; poor quality control; poor safety culture; poor system integration; inadequate manning/organisation of personnel resource.

Typical questions for this process might be (not exhaustive):

• Was the accident connected to a wider sequence of events?
• Was this a secondary effect of another change?
• A change in policy, manning, resource?
• Was this an extraordinary or new evolution?
• Were management objectives clear or did this create conflicting demands?
• Had similar circumstances occurred previously?
• Were standards and practices adequate?
• Were there high levels of work stress?
• Was morale good at the time the accident took place?

STAGE TWO: CONSIDERATION OF HOW TO PREVENT
RECURRENCE
Stage 2 seeks to identify what needs to be done to prevent recurrence of the
accident. The primary questions which must be asked in Stage 2 are:

Q1. If Human Factors contributed to the accident, what can we do to remove them to prevent recurrence?
Environment
• Can the environment be changed to reduce hazards? For example:
The lighting levels in the room where the equipment was located were low and
encouraged users to touch type rather than look at the keyboard. This made keying
errors more likely due to slips of attention and lapses of memory. For poorly lit
spaces, design equipment that does not place high demands on vision OR improve
the lighting.

Organisation
• Can organisational factors be changed to reduce hazards? For example:
Despite numerous complaints from operators about the unfamiliar keyboard, it was
assumed that users would ‘soon get used to it’ and it was ‘their job to enter the
correct codes anyway’. At the organisational level, improve the feedback from users
to designers and procurement specialists at DE&S, including contractors. Ensure
that end-user feedback is exploited in a continuous improvement process over the
equipment life cycle.

Design
• Can the equipment/workplace be made safer by changing the design? For
example:
An accident occurs because personnel entered the wrong codes into an automated
system. The keyboard had an alphabetic layout instead of a QWERTY layout.
Replace the keyboard with a QWERTY keyboard that personnel are familiar with.

Q2. How can we shape future Human Behaviour to
prevent accidents and incidents?
Better Safety Culture: Encourage personnel to follow safety rules within a just
and fair culture where all feel able to raise concerns about equipment design,
operating procedures, training etc.

Better Supervision: Focus on getting the job done safely and not just on getting the job done. Supervisors should ensure that all are aware not only of the accidents that might happen, but also of the likely consequences of those accidents.

Better perception of risk: Consider how human fallibility can interact with poor
working conditions to cause accidents. Learn how to recognise these factors and
take action before an accident or incident occurs.

Better leadership: Consider how leadership can be used to encourage safety awareness, behaviour and culture.

Better communication and feedback: Focus on communicating risks and accident feedback to inform behaviour if a similar situation arises. An accident may be prevented by early identification of hazards. Report all accidents, incidents and near misses.

Error is common, accidents are rare. People make errors all the time, but in well-designed systems, nothing usually happens.
Policy-Related Documents

BRd 2 Queen’s Regulations for the Royal Navy
JSP 375 MOD Health and Safety Handbook
JSP 418 MOD Sustainable Development and Environment Manual
JSP 832 Service Inquiries
BRd 9147 Navy Command Safety and Environment Management System
BRd 172 Guide to Ship’s Investigations and Royal Marine Unit Inquiries (The Yellow Guide)
BRd 167 SHE Manual for HM Ships and Submarines

APPENDIX A
EXAMPLES OF HUMAN FACTORS THAT
COULD CONTRIBUTE TO AN ACCIDENT
ENVIRONMENT

Poor visibility. Risk of injury due to slips, trips and falls, anxiety and stress
reactions, damage to equipment. Consider how well personnel were briefed,
whether their training was in date and whether there were a sufficient number of
trained personnel in the team.

Working in the heat. If not managed correctly, this may result in dehydration and fatigue, which can affect cognitive processes, increase the likelihood of error and increase the risk of dizziness and fainting. Why are personnel carrying out this task at the hottest time of day? Is there a supply of drinking water nearby? Is the work being carried out in accordance with official guidance (JSP 539, ‘Climatic Injuries in the Armed Forces’)?

WORK ORGANISATION
Time pressure. Were personnel under time pressure when the accident
happened? Was this a result of poor planning, equipment failure or
unanticipated events?

Unsafe Sea Boat driving. Consider the factors that led the coxswain to drive the boat in this fashion: time pressure; training; lack of awareness of hazard.

DESIGN
Badly designed workspaces. Accidents and injuries can happen when there is a mismatch between the physical dimensions of the work environment and personnel. Slips, trips and falls and head injuries are examples. The mismatch between personnel and their work environment can also interact with time pressure such that personnel take unsafe measures to complete a task.

Console layout. Dials and gauges should be easy to read to the required level of accuracy. Vigilance task overload can occur when operators have to maintain a continuous heads-up stance; forward visual displays containing machinery and navigation data may be an appropriate design solution.

APPENDIX B
EXAMPLES OF REAL ACCIDENTS
There is a great deal of evidence from a variety of fields that basic workplace
ergonomic failings can increase the risk of human error and the likelihood of
accidents in settings as diverse as health care and nuclear power. In the Royal
Norwegian Navy, Gould et al. investigated 35 accidents involving fast patrol
boats. Some examples of accidents are given and the reader is advised to relate
the contents of the narratives to the different components of Figure 3 in order to
understand how the model can be used as an aide-memoire to support a human
factors focus in accident investigation in the field.

Lack of training/lack of supervision: An inexperienced navigator lost control over his exact position. Failing to observe a waypoint, he was late turning. The vessel hit a submerged rock. This is a knowledge-based error caused by a lack of proper training and supervision.

Work Organisation/ineffective teamwork: Two boats in an exercise were unaware of each other’s position due to lack of radar/lantern use. The CO of one boat failed to inform the navigator of their relative positions. The lookout on one boat was visually impaired and unaware that he was supposed to be on duty, believing his main task was to man the gun. The boats collided. This is an error of omission caused by a failure to follow correct procedures.

Quality of Bridge Design: A coast guard vessel grounded when the retractable
sonar dome was left out by mistake following a crew change. The sonar indicator
was only visible from one side of the bridge, leaving the navigator unaware if the
ship had increased depth. The dome was damaged entering shallow waters. This is
an error of omission (failure to monitor) caused by poor design of the workplace.

Fatigue/Work Organisation/Environment: A single patrol boat crew was ordered to sail during a rest period in foreign waters. The crew had been awake for 48 hours and the previous rest period had been disturbed by high seas. The navigator misjudged two lights and consequently ordered a wrong turn. This is an error of commission caused by fatigue and environmental factors (high seas).

Displays/Bridge layout: The cruise liner ROYAL MAJESTY ran aground off Nantucket Island in 1995. After the ship set sail from Bermuda bound for New York, she dropped off the harbour pilot and the Navigator compared the ship’s position on the GPS (Global Positioning System) against the Loran-C (a radio-based navigation system designed to provide data along the coast of the United States) and found it to be well within tolerable limits. Shortly before reaching New York the ship ran aground on Nantucket Island, having drifted 15 miles off course. The grounding occurred because, shortly after leaving Bermuda, the GPS connector cable from the antenna had come loose and the autopilot had defaulted to dead-reckoning mode. There was nothing on the main bridge display to indicate that this had happened. A small maintenance console in a corner of the bridge did have a display to indicate the state of the system but, because the main display indicated that all was well, there was no obvious reason to check the maintenance console while under way. This is an error of omission (failing to check the GPS connection) caused by bad design and by bad drafting of SOPs.

Gould, K.S., Knappen Røed, B., Koefoed, V.F., Bridger, R.S. and Moen, B.E. 2006. Performance-shaping factors associated with navigation accidents in the Royal Norwegian Navy. Military Psychology (Suppl.), S111-S129.

APPENDIX C
TAKING A WIDER VIEW:
LOOK BACKWARDS AND OUTWARDS TO
IDENTIFY LATENT FAILURES THAT MADE IT
POSSIBLE FOR THE ACCIDENT TO HAPPEN
Flooding in HMS ENDURANCE2
16 December 2008
HMS ENDURANCE was operating in the South Atlantic when she suffered severe
flooding in the Engine Room, prompting damage control efforts by the Ship’s
company and resulting in near loss of the ship.
The Service Enquiry concluded that the flooding was due to an inadvertent opening of a hull valve during the cleaning of an inlet strainer. There was no necessity to clean the strainers at sea; this operation could have been performed before sailing or on arrival at the next port. Incorrect reconnection of control air lines is likely to have caused the inadvertent opening of the valve. The first time the lines were disconnected and reconnected this was undertaken correctly by two different persons; on the second occasion it was undertaken incorrectly by a third person.
Key contributory factors identified included: the absence of a responsible trained maintainer, the inability to maintain engineering standards, poor procedures, inadequate risk assessment and inadequate risk mitigation. Additionally, long deployments in isolated locations had not been factored into the manning organisation, with the result being up to 33% gapping. Design problems, and a lack of communication of these problems, were also cited as contributory factors.
Following Human Factors principles and the structure provided in the Guide, you
can work backwards from the event to identify the behaviour behind the accident,
contributing Human Factors and root causes. Figures 7 and 8 illustrate Stage 1 of
the Guide for the example of HMS ENDURANCE with an explanation of these shown
below.

Q1. Was there an error or a violation?

The person who reconnected the valves was unaware that his reinstallation was
incorrect. His action was unintentional and he carried out an incorrect action,
therefore this would be classified as a slip, an error in action.

2 Service Enquiry into the Flooding of HMS Endurance, 16 December 2008 (Restricted).
Q2. What Human Factors contributed to the error/violation occurring?

Six Human Factors were identified as contributing to the error, all within the Organisational and Design domains. These were:

• Insufficient skill/experience (ORGANISATION)
• Time pressure to re-install lines (ORGANISATION)
• Unnecessary decision to clean filters at sea (ORGANISATION)
• Lack of communication (ORGANISATION)
• Design flaw (DESIGN)
• Insufficient assessment of risks (ORGANISATION)

Q3. Why did these Human Factors exist in the first place?

Once the six factors were identified, you can work backwards and outwards to determine WHY these existed.

For example:
The Human Factor contributor of insufficient skill/experience (within the Organisational domain) existed because there was a poor level of supervision of an unqualified operator.
Why?
There was insufficient manning resource to support adequate supervision.
Why?
There were hybrid manning procedures in place (a flexible managed gapping routine).
Why?
A decision to increase deployment length was made. To allow for mandated harmony requirements the ‘managed gapping’ routine was adopted, but this was not identified as a risk.
Why?
The cumulative risk within the Manpower, Equipment, Training and Sustainability pillars was not identified and, outside of the ship, there was no clear owner of this cumulative risk (ROOT CAUSE).

Once this process is complete, this can be mapped onto Figure 8 to provide a
summary of the classification, contributory Human Factors and their root causes.
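
The chain of ‘Why?’ questions above is a repeated application of the same step: take the current explanation and ask why it held. A minimal sketch of that laddering, using the ENDURANCE answers as data, is shown below; the function name and the stopping rule (stop when no further explanation is recorded) are assumptions made for illustration.

```python
def why_ladder(finding: str, explanations: dict) -> list:
    """Follow successive 'Why?' answers until no further explanation is recorded."""
    chain = [finding]
    while finding in explanations:
        finding = explanations[finding]
        chain.append(finding)
    return chain


# Answers taken from the HMS ENDURANCE example above.
endurance_whys = {
    "Insufficient skill/experience":
        "Poor supervision of an unqualified operator",
    "Poor supervision of an unqualified operator":
        "Insufficient manning resource to support adequate supervision",
    "Insufficient manning resource to support adequate supervision":
        "Hybrid manning procedures in place (flexible managed gapping routine)",
    "Hybrid manning procedures in place (flexible managed gapping routine)":
        "Longer deployments with 'managed gapping' not identified as a risk",
    "Longer deployments with 'managed gapping' not identified as a risk":
        "No clear owner of the cumulative Manpower, Equipment, Training and Sustainability risk (ROOT CAUSE)",
}

chain = why_ladder("Insufficient skill/experience", endurance_whys)
print("Finding:", chain[0])
for answer in chain[1:]:
    print("  Why? ->", answer)
```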

Event: valve opens unexpectedly and a major flood occurs.

Q1. Was there an error or a violation? Incorrect re-installation of air lines (Error in Action - slip).

Q2. What Human Factors contributed to the error/violation occurring? Insufficient skill/experience (ORG); time pressure to re-install lines (ORG); unnecessary decision to clean filters at sea (ORG); lack of communication (ORG); design flaw (DESIGN); insufficient assessment of risks (ORG).

Q3. Why did these Human Factors exist in the first place? Incorrect emphasis on the importance of ballast tanks; insufficient knowledge of the state of the sea water system onboard; insufficient experience or knowledge; lack of continuation/induction/PJT training and specific training; gapping of posts/temporary cover; poor supervision; role or responsibility confusion; lack of awareness of a possible design flaw; possible lack of communication of specific flaws to the Platform Team; poor working practices; poor engineering standards and complacency; inadequate safety emphasis; insufficient manning resources; hybrid manning procedures (flexible managed gapping routine); changes to deployment length.

Q3. Root causes: lack of ownership of cumulative risk; poor safety culture.

Figure 7 - Looking backward and outward from the event: classification, identification of contributory factors and root causes.
Figure 8 - Classification, identification of human factors and root causes (summary of Figure 7).

EVENT: valve opens unexpectedly.
Q1. Error/violation: incorrect re-installation of air lines - Error in Action (slip).
Q2. Human Factors present: insufficient skill/experience (ORGANISATION); time pressure to re-install lines (ORGANISATION); lack of communication (ORGANISATION); unnecessary decision to clean filters at sea (ORGANISATION); insufficient assessment of risk (ORGANISATION); design flaw (DESIGN).
Q3. Root cause analysis: poor safety culture; lack of ownership of cumulative risk.

Grounding of HMS NOTTINGHAM3
7 July 2002
HMS NOTTINGHAM was en route to Wellington, New Zealand, after weighing anchor at 20:57hrs from Lord Howe Island. She ran aground on Wolf Rock at 22:02hrs after changing course to stow a Lynx helicopter that had landed on board with the Commanding Officer (CO), who was returning from the island.

The Board of Inquiry (BOI) concluded, inter alia, that:

HMS NOTTINGHAM grounded on Wolf Rock because insufficient attention was paid
by the Officer of the Watch (OOW) to the navigation of the ship and, in particular,
the navigation of the ship in relation to navigational hazards.
3 Board of Inquiry Report into the Grounding of HMS Nottingham at Wolf Rock, Lord Howe Island, Australia on 7 July 2002.

The Executive Officer (XO) and Navigating Officer (NO) had not ensured that a navigational plan was constructed which provided for a safe departure from the island and catered for the changes required for the recovery of the helicopter.

Working backwards from the immediate accident:

22:02:38. HMS NOTTINGHAM grounds on the western side of Wolf Rock.

22:02. The OOW was distracted by calls from the flight deck and engine room and
at 22:02 saw a ‘pale white glow’ when he looked out of the window. The NO also
saw this and went to check the chart. Realising the ship was in immediate danger, he
called to the OOW to change course but 5 seconds later, the ship grounded.

Immediately before the accident the OOW and the NO had a lengthy discussion
about the correct procedure to shut down an engine. This, according to the
BOI, distracted him from his ‘primary duty’ of navigation and maintaining a

proper lookout. With the OOW fully engaged dealing with the helicopter, whose
responsibility was the navigation?

In the minutes leading up to the grounding, the OOW was pre-occupied with
stowing the Lynx and the pitch and roll of the ship. In the later interview he said
he was “petrified of losing or damaging the Lynx”. While distracted with the
helicopter, the OOW assumed the NO would take care of the navigation, but this
was not verbally communicated.

22:00. Neither the OOW nor the NO noticed that the 2OOW had fixed the ship 4 cables South East of Wolf Rock and that it was heading directly towards it at 12 knots. The 2OOW had drawn part of the fix over Wolf Rock on the chart, completely obscuring it from view. He did not communicate this fix to the OOW or NO and was not supervised by either.

21:55. The XO asked the NO his intentions for getting back on track (to
Wellington). The NO stated he wished to get to the lee side of the island to stow
the helicopter. NOTTINGHAM changed course to North West 350 degrees, then 320
degrees. The NO did not check this new course by any means.

21:53. Lynx lands safely with CO.

21:49. Ship alters course again to 235 degrees.

21:44. The ship alters course to 230 degrees as the XO thought this would be a good course for rendezvous with the Lynx, leaving Lord Howe Island safely on the starboard bow. This course was checked on the 1:150,000 scale chart by the OOW. NOTTINGHAM was now 2nm away from Wolf Rock, with no significant safety considerations in place.

21:25. NOTTINGHAM changes course from east-west to 140 degrees en route for
Wellington. The new course was not checked for hazards visually, by radar or by
chart. The NO returned to the Bridge at 21:37.

At no time between getting under way at 20:57 and the grounding at 22:02 did the OOW or the NO refer to the chart or track, take a fix or ask for a fix to be reported to them. Prior to weighing anchor at Lord Howe Island, Wolf Rock had not been identified as a significant danger or ‘hatched-off’ on the chart.

Despite being 300 yards from the limiting danger line, neither Special Sea Dutymen,
Tiller Flat personnel nor Blind Pilotage Safety Officer were closed up, nor was the
echo sounder switched on.

The CO made a last visit to the island at 20:05, having approved the Navigator’s plan for the passage to Wellington. He instructed the NO to ‘stay out to the East’ and the XO to ‘carry on down the navtrack’ and pick up the Lynx, which were conflicting instructions. After some discussion, the XO and NO agreed to weigh anchor and head East. The position of Wolf Rock had not been entered into the electronic navigation aids, command system or command support system.

Looking Backwards and Outwards

The report states that Wolf Rock had not been identified as a significant danger
when the ship was at anchor at Lord Howe Island. It had not been ‘hatched-off’ on
the chart. The report indicates that this does not reflect a failure of RN navigational
training but rather that the correct standards of bridgemanship, navigation
planning and execution were not maintained. Why, with 4 officers on the Bridge at the time of the grounding, was this allowed to happen? There are a number of questions that are not fully answered in the report – some possible reasons for the behaviour on the bridge are offered below.

The echo sounder was not switched on when the ship weighed anchor, nor were additional safety procedures put in place. Why? Decisions made on that day reflected a team willing to take unnecessary risks with the ship, against common practice. The behaviour on the bridge suggests that routine violations (not complying with standard practice) were commonplace.

The OOW had assumed that the NO was overseeing navigation on the ship whilst
the OOW was concerned with the Lynx stowage in the minutes leading up to the
accident. This was not communicated between the OOW and the NO. Why did the
OOW assume that the NO would take charge, and why was this not communicated?
A few minutes prior to the grounding the NO advised a change of heading to the
OOW (without checking). This may have led to the OOW assuming that the NO
was monitoring the navigation. This role confusion could have been avoided;
it is possible that the NO had taken over navigating from the OOW in previous
instances, and so he assumed this would be the case. It is also possible that the
discussion regarding engine shut down left the OOW not wanting to communicate
with the NO for some reason. During his interview the NO stated that he had been
annoyed earlier by a change of heading that occurred without his consultation. It is
possible that this made him less likely to assist the OOW later on. The question as to
why this confusion occurred was not fully answered.

The 2OOW was not supervised, despite being ‘unqualified’. He was the only officer to take a fix; would supervision of this task have enabled Wolf Rock to be seen on the chart? It is possible that proper supervision may have enabled the OOW to realise that there was a hazard that had not been identified. Why did the 2OOW not report the fix to the OOW and the NO? It may have been a case of a routine violation (it became normal for the 2OOW not to report fixes), a knowledge-based error (he didn’t know he had to report it) or an error of omission (he forgot to report it). Why was the 2OOW seemingly unaware of the immediate danger when he was looking at the chart? There is evidence of a lack of correct supervision and leadership (Organisational failure).

The CO approved the Navigator’s plan despite serious omissions in the final plan, did not follow correct procedures in the Sea Order Book and did not check the navigation plan on the chart. Why? He assumed the XO would undertake responsibility for executing the plan to weigh anchor and recover the Lynx en route, but this was not explicitly stated to the XO. Could these be examples of routine violations (he’d done the same previously and no problems occurred) or a lack of appreciation of the risks (lack of safety culture)?

Lack of safety culture, poor communication of safety management and poor
leadership are suggested as the root causes, based on the evidence of the report.

Additional Discussion Points

Passing mention is made of the layout of the chart table, at the rear of the bridge, as not ‘conducive’ to the monitoring of the ship’s progress. Could this have played a role in none of the qualified bridge team paying attention to it for over an hour prior to the accident? Although procedures should have been followed, they were not. The chart table could not be used easily while monitoring the progress of the ship, which could be addressed through a redesign.

Although the report states that a qualified OOW can reasonably be expected to simultaneously recover the aircraft, navigate the ship and maintain a lookout, the distraction by the Navigator at a crucial time and the apprehension about damaging the aircraft may have proved too much for the OOW to cope with, making lapses in attention or errors of omission more likely.

Why was the Lynx continually used prior to weighing anchor, knowing that there
would be difficulties landing it on the return due to the swell? Did personal
priorities (getting time ashore) outweigh the risks associated with recovering the
Lynx in relatively unknown waters?

Why did the NO and the XO not check the charts before weighing anchor? This is
against standard practice, in violation of the procedures set in place to keep the
ship safe.

Why was there an apparent lack of Safety Culture and risk awareness on the Bridge? The report details that previous OOW manoeuvres on 1 July were conducted with few additional precautions, and suggests that the conduct was indicative of a team that was willing to take ‘unnecessary risks’ with the ship’s safety. A lack of adherence to safety routines and safety practice is likely to indicate routine violations throughout the chain of command.

Why was everyone, from the CO downwards, seemingly unaware of the existence of Wolf Rock (lacking in knowledge)? Wolf Rock is named after ‘Wolf’ - an ex-Royal Navy gun brig which, while being used as a whaler, ran aground on the rock in 1837.

Capsize of the Herald of Free Enterprise
6 March 1987
Organisations normally erect multiple barriers to stop accidents from happening.
Reason (Figure 2) reminds us that none of these barriers is perfect and that
an accident is often the end point of a process in which successive barriers are
defeated as events unfold. The focus on organisational factors requires the accident
investigator to look beyond the immediate work environment at the time the
accident took place and examine how the accident might have ‘escaped’ through
the web of barriers put in place to prevent it from happening. Some key areas
of focus include: training and team composition/selection; supervisory style and
safety culture; organisational policies and processes, including the composition
and function of safety committees and supervisors; risk reporting and reduction
and record keeping. Perceptions of safety at work, the extent to which operators
perceive the workspace as hazardous and the presence of perverse incentives for
unsafe behaviour are also important.

Applying Reason’s Swiss Cheese model to the ROYAL MAJESTY grounding, we can work backwards from the error (failure to check that the GPS was working properly) to long before the ship was built. Clearly, the decision to locate the GPS antenna connector display in a maintenance console, rather than the main console on the bridge, reflects a perception on the part of the designers that this was a maintenance issue and not an operational issue. At no stage during the construction and testing of the vessel does this appear to have been questioned. It is noteworthy that the officers on the bridge undertook all the standard checks that were required of them before leaving Bermuda; checking the GPS connector was not one of them. Therefore, a questionable design decision was compounded by inadequate drafting of procedures, leaving a ‘latent failure’ that resulted in the accident happening when conditions were right. The official report, however, focuses on what the bridge team should have done as the ship approached Nantucket Island - easy to say with the benefit of hindsight. A human factors approach would focus on what designers should have done and what the management should have done when drafting the operating procedures for the vessel. At the time, ECDIS was a newer piece of equipment than the land-based radio system (LORAN-C) and the bridge crews tended to regard it as the main source of navigational information and the ground-based radio system only as a back-up. Thus, they never thought to consult the LORAN-C system before reaching New York, and there was insufficient automation and integration of equipment on the bridge to compare the positional information available - technically this would have been easy to achieve.

The investigation into the ferry ‘HERALD OF FREE ENTERPRISE’ which capsized
outside the port of Zeebrugge on 6 March 1987 provides a good case study of the
way in which the Swiss Cheese model can be applied.

The HERALD OF FREE ENTERPRISE was a roll-on/roll-off ferry designed for the
Dover-Calais route with double linkspans at both ports. Vehicles could be loaded
simultaneously onto G and E/F vehicle decks through vertically hinged watertight
bow doors. The bow doors could not be seen from the Bridge.

On the day of the accident, HERALD OF FREE ENTERPRISE was at Zeebrugge in Belgium. She had not been designed for this port and there was only one deck for loading vehicles. To load the higher decks, the bow ballast tanks were filled because the ramp was not high enough to reach E deck. After all the vehicles had been loaded, the tanks were NOT emptied, meaning the bow was lower in the sea than normal.

The assistant boatswain was supposed to close the G deck bow doors BEFORE the
ship slipped her moorings. He had, however, gone for a nap after cleaning the deck.

The first officer was supposed to remain on G deck until after the doors had closed but is believed to have been under pressure to get to his station on the Bridge, believing that the assistant boatswain was on his way. The boatswain - the last person on G Deck - said that he did not close the doors because it was not his duty.

At 18:05hrs, believing that the bow doors had been closed and unable to see them
from the bridge, the Captain gave the order to depart. The ship had 80 crew, 459
passengers, 80 cars, 3 buses and 47 trucks.

At 18:24hrs, the ship entered the open sea. When the ferry reached 18.9 knots (21.7 mph), water began to enter through the G deck doors. The ship capsized in 90 seconds. The electrical and power systems failed immediately, leaving the ship in darkness. She came to rest in shallow water on a sandbar, which prevented her from sinking completely.

Despite immediate attempts at rescue, 189 people died. The water temperature was 3 degrees Celsius - most drowned when they became incapacitated as a result of cold-water immersion.

Conclusions of the main report. There were three main causes:

• Failure to close the bow doors
• Failure to check that they were closed
• Leaving port with the doors open

The following performance shaping factors should also be considered:

• Poor communication at all levels in the hierarchy
• Failure to empty the ballast tanks prior to departure
• Rejection at board level of the proposal to install a warning light on the bridge
• Hydrodynamic factors
• Bow wave above 18 knots
• ‘Squat effect’ in shallow water

EXTRACTS FROM THE OFFICIAL REPORT INTO THE HERALD OF FREE ENTERPRISE SINKING
The following extracts from the official report into the accident should be studied in relation to this case study, emphasising the need to consider the organisational factors that cause risk factors for human error to be present in the work environment.

No Standard Routine for Loading the Ferries
Extract from the official report:
First, Captain Lewry merely followed a system which was operated by all the masters
of the HERALD and approved by the Senior Master, Captain Kirby. Second, the
court was reminded that the orders entitled “Ship’s standing orders” issued by the
Company make no reference, as they should have done, to opening and closing the
bow and stern doors. Third, before this disaster there had been no less than five
occasions when one of the Company’s ships had proceeded to sea with bow or stern
doors open. Some of those incidents were known to the management, who had not
drawn them to the attention of the other Masters.

Pressure to Leave Harbour Early

Personnel in the loading bay were under pressure to leave the bay as early as
possible and return to their harbour stations (the bridge, in the case of the 1st
officer). Extract from an internal memo dated 18 August 1986:

“There seems to be a general tendency of satisfaction if the ship has sailed two
or three minutes early. Where, a full load is present, then every effort has to be
made to sail the ship 15 minutes earlier . . . . .I expect to read from now onwards,
especially where FE8 is concerned, that the ship left 15 minutes early . . . . . put
pressure on the first officer if you don’t think he is moving fast enough. Have your
load ready when the vessel is in and marshal your staff and machines to work
efficiently. Let’s put the record straight, sailing late out of Zeebrugge isn’t on. It’s 15
minutes early for us.”

Inadequate Information on the Bridge: no indicator lights

On the 28th June 1985 Captain Blowers of the PRIDE OF FREE ENTERPRISE wrote a memorandum to Mr. Develin. The relevant parts of the memorandum are these:

“In the hope that there might be one or two ideas worthy of consideration I am
forwarding some points that have been suggested on this ship and with reference
to any future newbuilding programme. Many of the items are mentioned because
of the excessive amounts of maintenance, time and money spent on them.”
“4. Mimic Panel - There is no indication on the bridge as to whether the most important watertight doors are closed or not. That is the bow or stern doors. With the very short distance between the berth and the open sea on both sides of the channel this can be a problem if the operator is delayed or having problems in closing the doors. Indicator lights on the very excellent mimic panel could enable the bridge team to monitor the situation in such circumstances.”

Mr Develin circulated that memorandum amongst managers for comment. It was a
serious memorandum which merited serious thought and attention, and called for a
considered reply. The answers which Mr. Develin received are set out verbatim:

From Mr. J.F. Alcindor, a deputy chief superintendent: “Do they need an
indicator to tell them whether the deck storekeeper is awake and sober? My
goodness!!”

From Mr. A.C. Reynolds: “Nice but don’t we already pay someone!”

From Mr. R. Ellison: “Assume the guy who shuts the doors tells the bridge if there
is a problem.”

From Mr. D.R. Hamilton: “Nice!”

The official report concluded that:

“It is hardly necessary for the Court to comment that these replies display an absence of any proper sense of responsibility. Moreover, the comment of Mr. Alcindor on the deck storekeeper was either ominously prescient or showed an awareness of this type of incident in the past. If the sensible suggestion that indicator lights be installed had received, in 1985, the serious consideration which it deserved, it is at least possible that they would have been fitted in the early months of 1986 and this disaster might well have been prevented.”

From the above, it is clear that the immediate causes of the accident were easily identified: there were risk factors for human error in the work environment at and around the time the accident took place. Fatigue, pressure to leave early, poor interpersonal communication and inadequate information on the bridge displays are examples. However, the presence of these risk factors is evidence of a far deeper malaise that permeated the entire company up to senior management level - a lack of shared responsibility for safety, poor perceptions of risk and a failure to report and circulate information about ‘near misses’ (previous incidents where ferries had left with the doors open). Using the terminology of the Swiss Cheese model, it is clear that very few barriers were put in place by the company to stop the ferry from sinking after loading, probably due to an excessive focus on turnaround times and a lack of awareness of the nature of the risks.

Rescue
Operations to rescue trapped passengers and crew, and recover the dead, involved
the Royal Navy; extracts from the Marine Accident Investigation Branch report4 state:

About 2000 HMS HURWORTH in Ostende sent her divers by road to Zeebrugge and
at 2020 BNS EKSTER sailed from Zeebrugge with more divers. The RN Clearance
Diving Centre at Portsmouth were alerted at this time.

By 2330 it was apparent that most of the survivors above water level had been
rescued and divers were organised to begin recovering bodies while still searching
for survivors.

At 0030 divers were despatched in an inflatable craft to hammer on the bottom of the wreck because there was no obvious access to the engine room.

At 0115 three survivors were found in the forward drivers’ accommodation. It must
be assumed that these were the last to be found alive. Shortly after this plans of
the vessel arrived. Sub-Lieutenant Cox (HMS HURWORTH) organized a search with
the UK and Belgian clearance diving teams. At 0145 diving was again suspended
until more lights became available at 0215. Thereafter systematic searching of the
vessel continued. Helicopter movements were suspended to make it possible to
communicate and to listen for hammering.

4 Report of Court No. 8074 Formal Investigation.