
Human Factors Analysis of the North Sea Oil Rig Accident


Contents
Introduction
Decision-making and errors
Conclusion
Reference list
Appendix
Introduction
On the evening of 6 July 1988, 167 men lost their lives in the North Sea (McGinty, 2018).
At 2145 hours, an alarm sounded, indicating that condensate Pump B had stopped. One pump
must always be running to power the platform and its sister platform, and Pump B was the only
pump running at the time because Pump A was under routine maintenance. After finding the
wrong Permit-To-Work (PTW), one that was no longer current, the manager concluded that it
was safe to start up Pump A. However, a suspended PTW stated that the pump must not be
started under any circumstances because of the missing valve; that document had been filed in
the wrong location.

When the pump was restarted, high-pressure gas escaped through the blind flange, triggering
several alarms. Moments later it ignited and exploded, destroying the firewall; the crew in the
control room, which was adjacent to the condensate room, were knocked off their feet. Soon
after, the rig's emergency stop button was pressed, which should have stopped the flow of oil
and gas; however, the condensate pipe had been ruptured by one of the flying panels, creating
another fire. There were still no alarms warning the workers of the unfolding accident because
the announcement system had been destroyed. The automatic firefighting system had also been
switched to manual mode because divers were working under the platform, and it never started.
The access route to the firefighting system was unprotected, so the crew could not turn it on
(Macleod and Richardson, 2018).

The floor was grated so that spilled oil would drip straight into the sea, but rubber matting had
been laid over the sharp grating so that the divers would not step on it with their bare feet.
When a crude oil pipe ruptured due to the intense heat, the rubber matting caught the burning
oil dripping from module B above it, which superheated a high-pressure gas pipeline connected
to Tartan. The pipeline ruptured, releasing 15-30 tonnes of fuel into the existing fire (Reid,
2020).

All the alarms were going off; although main power had been lost, the alarm system was
running on a backup battery. The control room and radio room were abandoned after the
explosion. As the situation worsened, the organisation of the crew disintegrated: there was no
one to give instructions, the crew was disoriented, and there were no announcements from the
managers, nor any attempt to make them. There was no order to evacuate and no systematic
attempt to lead the crew to safety; the men were trapped, waiting in the accommodation module.

The managers of the connected platforms did not feel empowered to decide for themselves, so
they waited for instructions from shore. As a result, they did not stop pumping oil and gas to the
platform, feeding and intensifying the fire and making it impossible for rescue boats and
helicopters to get close. Some of the crew jumped into the sea from a height of 53 metres and
were then rescued by boat.

Decision-making and errors


Classical decision-making (CDM) and naturalistic decision-making (NDM) are two types of
decision-making. CDM is a formal approach that follows a set structure: recognising a problem,
analysing it by examining relevant data, deriving a plan and its consequences, acting on it, and
evaluating the implemented plan (Hoy, 2019). It is a rational approach that seeks an optimal
decision, because the theory compares multiple options to achieve the best outcome. An NDM
decision, by contrast, is made using previous experience; it is derived from familiarity and
long-term memory, unlike CDM. The situation still needs to be understood and matched to the
most plausible solution, and a new one is generated if it proves unsuitable.
The decision to turn on Pump A was CDM: the procedure was followed, but a selection error
and a routine violation had occurred beforehand. The shift manager was not aware of the
suspended PTW document for the valve, which was not in its correct location (Macleod and
Richardson, 2018). He did, however, find the permit for the overhaul, which led him to start
Pump A. Staff routinely signed PTWs without visiting the work site, and information was
poorly communicated at shift handover (Cullen, 1991). The organisation had previously been
found guilty of running an insufficient Permit-To-Work system but decided not to improve it.

The trainers committed routine violations and qualitative errors, giving minimal training to new
workers if they had offshore experience. Workers were expected to learn on the job by watching
others and applying the NDM method when trouble arose. Some crew members were not even
shown where the life rafts were or how to operate them in an emergency. This could be a lapse
error, as the trainers did not train every crew member and the training lacked consistent content.

Safety and leadership on the platform were lacking, and it was common not to follow safety
training procedures. This is NDM: the crew were not encouraged to follow the procedures, grew
confident, and thought them unnecessary because skipping them had worked before. This shows
that the organisation relied on individual safety practices rather than a strong safety culture,
which encouraged shortcuts leading to routine violations. Due to lapse errors and possible
knowledge-based mistakes, more than 40% of the sprinklers would have been unable to
discharge water because of poor maintenance.

The Offshore Installation Managers (OIMs) did not apply NDM and did not feel empowered to
decide for themselves, so they used CDM and waited for instructions from shore because they
were ill-prepared for this scenario. As a result, they were indecisive, hesitant to take action, and
slow to stop pumping gas and oil to Piper Alpha. The manager was in a state of shock, lost his
leadership, did not take control of the situation, and abandoned the control room, making a slip
error. This led the crew and the managers to make a knowledge-based mistake: waiting for
instructions that never came.

Conclusion
Although the manager made a methodical decision using CDM, it was futile because of the
weak safety culture and the routine violations that preceded it. The lack of adequate crew
training, work safety policies and procedures, and communication ultimately led to the demise
of Piper Alpha. Routine violations, lapse errors, mistakes, and qualitative errors had become
commonplace.

During the shift handover, the Permit-To-Work document was incorrectly filed and the work
was not communicated, meaning the correct procedures had not been followed. A verbal and/or
written exchange should be required; it would have conveyed the information about the missing
valve. A written handover procedure with a satisfactory standard of information should be
added to the procedures and enforced, to discourage the shortcuts that lead to routine violations
and rule-based mistakes. This would make such documents easy to find and help avert a
disaster like this one.

More training needs to be provided for the offshore managers so they can keep their composure
and lead the crew to safety. The organisation should give OIMs regular training through joint
exercises involving multiple platforms, reducing the risk of slip errors and mistakes. They
would then be able to use the CDM method to be more decisive and act quickly without
hesitation.

Knowledge-based mistakes by the crew were inevitable because they were not adequately
trained; some did not even know where the life rafts were located. To avoid such mistakes and
lapse errors, there must be regular safety refreshers covering emergencies and work procedures.
The company's refusal to change its flawed practices, even when changes were suggested by the
auditors, was another mistake. It had been recommended that the firefighting system be left on
automatic whenever divers were not working within 5 metres, as the manual setting was
unnecessary and dangerous; yet the flawed practices continued, producing exceptional
violations. The safety culture needs to be strengthened by encouraging the reporting of
violations and of complacency towards procedures, actively involving the workforce, carrying
out regular risk assessments, and using CDM to improve the workplace. This also means
putting safety before production and removing any reason to cut corners: improving the way
tasks are set, making workloads and targets realistic, and removing unrealistic procedures and
unnecessary inconvenience.

Safety equipment such as the firefighting sprinkler systems and the grated floors should be
regularly inspected and maintained to ensure that nothing blocks the grating and that the route
to the water pump machinery is kept clear and reachable in an emergency. The fire exits need to
be clearly labelled and well lit so they are easy to access when the crew evacuates. Safety
standards must not be compromised even when the platform is under major construction, in
case of abnormalities and emergencies. This includes retraining crews in the new environment,
creating new procedures, training in decision-making techniques (CDM and NDM), and
ensuring the crew has knowledge and understanding of the system.

Reference list

McGinty, S. (2018) Fire in the night: The Piper Alpha disaster. London: Pan Books.

Cullen, W.D. (1991) The Public Inquiry into the Piper Alpha Disaster. London: HMSO.

Piper Alpha (2022) Human Factors 101. Available at:
https://humanfactors101.com/incidents/piper-alpha/ (Accessed: October 15, 2022).

Reid, M. (2020) 'The Piper Alpha disaster: A personal perspective', ACS Chemical Health &
Safety. Available at: https://pubs.acs.org/doi/10.1021/acs.chas.9b00022 (Accessed: October
19, 2022).

Macleod, F. and Richardson, S. (2018) 'Piper Alpha: The disaster in detail', The Chemical
Engineer. Available at: https://www.thechemicalengineer.com/features/piper-alpha-the-
disaster-in-detail/ (Accessed: October 19, 2022).

Macleod, F. and Richardson, S. (2018) 'Piper Alpha: What have we learned?', Loss Prevention
Bulletin, 261, pp. 3-9.

A process hazard analysis of the human factor: Piper Alpha incident (no date) Anatomy of an
Incident, ABB. Available at: https://www.anatomyofanincident.com/incidents/piper-
alpha/piper-alpha-process-hazard-analysis-human-factor/ (Accessed: October 20, 2022).

Hoy, W.K. (2019) Decision-Making Theory. Available at: waynekhoy.com.

Decision making in aviation: Classical versus naturalistic (2011) Aviation Medicine ::
Aerospace Medicine. Available at: http://www.avmed.in/2011/03/decision-making-in-
aviation-classical-versus-naturalistic/ (Accessed: October 20, 2022).

Human factors: Managing human failures (no date) Health and Safety Executive. Available at:
https://www.hse.gov.uk/humanfactors/topics/humanfail.htm (Accessed: October 22, 2022).

Appendix

Figure 1: Classical decision-making action cycle (Hoy, 2019).
