podcast series

an original podcast by SafetyCulture

We all accept a level of risk - in our lives, in our jobs. A baseline of risk is inevitable. But what happens when your baseline starts to shift? When you and everyone around you deviates from what is safe, bit by little bit?


04: Back to Consciousness

It’s been a long time since Piper exploded: experts are nervous that the industry has forgotten and is on the precipice of another disaster. So, can we ever be sure that the right warnings are getting through, and should we all be trying to jumpstart our own chronic unease in order to keep ourselves alive?


Jake Molloy: You wouldn’t jump in a car and drive 5,600 miles with a known fault on the brakes or a leak on the brake fluid or something that’s just putting you, your family, and the dog at risk. But major oil companies are prepared to tolerate this work around, this acceptance of risk based on their interpretation of operational risk assessment.

Claire Stewart: It’s pretty clear as soon as I meet Jake Molloy that he doesn’t beat around the bush.

That’s especially true when he’s talking about what oil and gas companies are doing these days to make sure their installations are risk free and safe to work on.

But you wouldn’t expect him to sugar-coat it, given he’s a dyed-in-the-wool union man.


Jake: This is where it all falls down, this balance under our regulated structure which is you reduce risk to as low as reasonably practical. To me it’s an excuse for not doing the job. What this is, it’s mitigation to prevent an incident occurring as a consequence for something failing.

Claire: This is Baseline by SafetyCulture. I’m Claire Stewart.

Jake: Oh I remember it all as clear as day. Some of the best and worst times of my life offshore.

Claire: These days Jake’s a regional organiser for the offshore energy branch of the Rail, Maritime and Transport Workers union.

But before he moved into the unions full time, he did 17 years on the rigs.

Jake: In Aberdeen the only thing missing half the time was a tumbleweed. Lots of Americans wandering about in their big boots. A lot of bars down by the harbour, down by the quayside. It was a mad old time but good fun as well. There was a lot of camaraderie and everybody knew everybody, because a lot of guys moved around a lot of different jobs and you’d always bump into somebody that you knew.

Claire: He makes no bones about the fact the relationship between unions and operators has long been fraught, and more than occasionally combative.

But he’s definitely not the only one who’s worried about the growing trend for what the industry calls ‘operational’ risk assessments and lax safety processes.

The UK safety regulator also has big concerns:

Jake: When you dig down into the Health and Safety Executive’s review, you come to a section about topic performance, where the HSE inspectors rate the operators in terms of maintenance strategy and risk assessment, and a lot of elements.

The number that are moving from broadly compliant, which is not a great measurement in any case and if that’s what they’re striving for then God help us all, but there’s a number moving from broadly compliant into the poor, very poor category. You’re also seeing an increase in what the HSE term as non-compliance, something like a 60, 65 percent increase.

But the worrying one for us is that workforce reports, workforce concerns to the Health and Safety Executive have gone down by over 70 per cent.

Claire: Jake says it’s evidence that workers are now more worried than they used to be about reporting near misses and niggling concerns.

Because they don’t want a reputation for being a troublemaker, or difficult, and they don’t want to be fired.

2018 marks 30 years since the world’s worst offshore disaster, which killed 167 of the 228 men that were working on the Piper Alpha oil and gas rig in the North Sea.

Lord Cullen led the government inquiry into the disaster, six months of investigations and hearings. In the end, he handed down 106 recommendations which changed the industry overnight.

Stringent safety standards and better operational procedures and processes became law.

Because Piper was a disaster of cataclysmic proportions, governments and companies around the world took notice.


Lord Cullen: And I think there has been a change in attitudes towards safety, certainly in the offshore sphere, because people are much more conscious, from top to bottom of companies, that they've all got a responsibility, they've got something to preserve in the way of safety.

Claire: The question is, thirty years on, have the lessons of Piper been forgotten? And if they have, what’s to be done about it?

Steve Rae: I’ve done my survival course seven times. How much has it changed? It’s got easier. It’s got less intense. What does that tell you?

Claire: Piper survivor Steve Rae spent nearly five years on the leadership team of an influential non-profit industry body called Step Change in Safety.

Steve: It tells you that we’re deviating away from ‘this is a scary business’ to, ‘we just need to do the training because it’s a certificate that you need to have’. And if you go down that route, how do you approach it in your mind? Why it’s just compliance. I just need to have a certificate to get offshore. Rather than, this stuff could save my life.

Claire: He says things are better than they were.

But softer training requirements are just one sign that people are changing their attitude towards risk and are shifting away from that baseline of zero tolerance.

Steve: I think we’re in a place now and I genuinely believe this or I wouldn’t be in the industry, that we have a far more robust protocol for, dare I say, defending us against these potential risks and we’re better at capturing them on the way through so the likelihood of catastrophic failure is reduced or mitigated.

Of those 106 recommendations, many, many of them have made a huge difference. But ultimately, there is an individual decision to be made, whether you choose to follow them, or not. Or choose to police them, or not.

Claire: It’s what the UK’s regulatory body, the HSE, has been warning about. It wrote to all offshore production operators in the North Sea in July.

It said that the number of gas leaks on rigs in the North Sea has been increasing, and some operators had come “perilously close to disaster” recently.

Steve: You can have the best process, procedures, protocols in place, but ultimately, there's an individual decision to be made, whether you follow them, or not. And that's the bit that you can't really legislate for.

Claire: Leading industry body, Oil and Gas UK, has acknowledged the regulator’s concerns.

It says its own data shows continued improvement across a range of trends, but that doesn’t mean operators can become complacent. It’s quoted as saying: “Since Piper Alpha, we’re all too aware of the personal and long-lasting consequences if things go wrong”.

[Sounds of Melbourne trams, traffic lights, streets]

Claire: Across the other side of the world in Melbourne, I track down RMIT Associate Professor Jan Hayes. She’s a former chemical engineer who moved from working offshore at Esso into risk management and academia.

Jan remembers as a young engineer being asked to help formulate the Australian industry’s response to the Cullen report findings. It was a catalyst for her move out of industry and into safety research. 

These days she focuses on the organisational causes of accidents. Jan believes accident prevention ultimately comes down to leadership and operational decision making.

Jan Hayes: Really what we’ve seen in recent years is accidents caused by decisions made at the most senior levels of organisations, often regarding cutting costs in times when the oil price falls, or in other sectors when they’re under other pressures to increase profitability by reducing expenditure. And so we’ve seen this trend where you might have senior members of organisations, even right up to the board level, where people don’t actually understand the potential for disaster.

Claire: Part of the difficulty is making people understand those potential disasters she talks about aren’t going to happen straight away.

Jan: It’s not like you cut the maintenance budget by 10 or 20 per cent and you’re going to have an accident tomorrow; it might take a decade for that to come to fruition. But there are many accidents where you can draw a direct line between those kinds of organisational priorities and things that play out years later. And it’s very easy to point to an operator that made a wrong decision on a particular day, or a maintenance person who made an error, but you have to think about the fact that they’re working in this environment where that error had such horrendous consequences because all the Swiss cheese slices had all these massive holes that had been developing over decades and decades.

Claire: Piper survivor Geoff Bollands talks about the Swiss cheese model too, which is when a number of small, unconnected errors happen to line up perfectly and result in a much bigger accident.

Jan says the only way to help stop that happening is by ensuring people and the organisations they work for exist in a state of what she calls “chronic unease” which helps avoid the small errors.

Jan: I think some people have that chronic unease despite whatever organisation they work in but you’re much more likely to have that as a way of being in an organisation if the senior people acknowledge that.

I used to run a lot of HAZOP workshops and risk-type workshops for people, and some of those things can be a bit of a drag, in that they’re a very detailed way of going through a design and you’ve got a bunch of people in a room and you’ve got to maintain motivation to pay your best attention to stuff. So I had this thing I used to pull out, you couldn’t do it too often, but I would basically tell people the story of what happened to me on West Kingfish.

Claire: What happened was that, as a junior chemical engineer, Jan was working with two colleagues on the West Kingfish platform in Bass Strait off Victoria.

The three of them were at the spot on the platform where high and low pressure gas pipes meet. The company knew it wasn’t particularly safe so they’d actually been sent in to work out how to fix it.

Jan: We went offshore that day and we had no idea this was how it was going to end. 

Claire: It was a noisy part of the rig, too noisy to talk with her colleagues, but even so she thought something didn’t quite sound right.

She decided to walk round the corner to ask the operator for his opinion. As she did, the pipeline blew. One colleague was badly injured, the other one died.

Jan: So in a way the punchline would be there are no neon lights, there are no alarms that go off. This banal thing that we are doing today could prevent someone from dying in five years time and it deserves our best attention.

Certainly I mean to a very limited extent I went through that survivor guilt thing, if I’d gone and spoken to the operator earlier maybe I could have prevented the accident. That worried me for a little while but realistically I was so young and I didn’t know what was going on…

Claire: I ask Jan if events like that instil in a person the sense of chronic unease. She says, yes.

Jan: I don’t have any personal experience of this but I have heard others talk about the fact that there are now people moving into middle and upper middle management positions where they’re making important decisions who were not in the industry at the time of Piper, they’re too young and so they don’t have this chronic unease, they don’t have this sense of what can happen, the sense of coming to work one day and hearing that in our industry 167 people have died on a facility just like ours.

Claire: As a union chief Jake Molloy gets the juice on most of the issues offshore. Brent Bravo in 2003 was the worst incident he recalls in the North Sea since Piper.

Safety representatives on the rig were worried about a number of issues. Shell was the operator, and Jake says the company was listening but reluctant to intervene. The unions complained to the UK regulator.

Jake: HSE spent three, four months investigating and came back in August 2003 with a lengthy letter saying things could be improved upon, but essentially there was no imminent risk to anybody working on the Shell installation.

And three weeks later, there was over six tonnes of gas released into one of the legs on the installation. And two men, one of whom I had done my safety training with, were killed. Asphyxiated by the gas. But that leg sat right below the accommodation block, and there were six tonnes of highly volatile gas, which with an ignition source would probably have collapsed the leg, and it would have taken the accommodation block into the sea with 155 men in it.

Claire: Jake is baffled by what happened on Brent Bravo. Why hadn’t the operator properly comprehended the danger they were putting rig workers in, given the location of the gas and its volatility?

He tells me about another near miss in 2017 where a non-manned installation had a significant gas leak. The issue was showing up on the control panels of the mother installation.

Jake: But the instruction was to just acknowledge it and move on. So they acknowledged the gas leak for four days. And then they decided they would send ten guys in a helicopter to the installation. And the helicopter landed, and the guys got off, and the helicopter took off, and the guys headed down to what would be the temporary safe refuge, the accommodation block in the event of an incident to find they couldn't get near it because it was in a cloud of gas. As was their means of escape. The lifeboat.

Claire: Jake says it was only because the wind was blowing in the right direction that the helicopter didn’t explode as it landed on that platform.

Jake: Because a helicopter's engine would have ignited the gas and everybody on the helicopter would have perished were it not for the fact that the wind was blowing in the right direction.

Claire: He’s hearing fewer stories of near misses now and fewer complaints, but he says that doesn’t mean that things are rosy on the rigs, especially when the regulator’s data is showing a drop in safety compliance levels.

Lots of complaints, or near miss reports, are often a sign that there’s a solid respect for making safety a priority on a platform.

Jan’s research has shown that sharing stories about close calls and accidents is absolutely critical for building a culture of safety in organisations. And it can provide a gold mine of information for the company, if it knows how to use it.

Jan: It’s all about what happens with those near misses when they’re reported isn’t it. So if people are punished for reporting near misses then they’re going to stop reporting them but if people see action taken then, if people see that they’re used for trend analysis, or that they’re used for learning or shared in smart kind of ways, then the level of reporting can be enormous.

Claire: She did her doctoral research ten years ago at a nuclear power station that was getting 2000 near miss reports a month.

Jan: I mean it was a big site, there were a few hundred people working there, but they were getting that number of reports because the organisation was treating them like gold as a way of finding out what was going on in the organisation. They weren’t treating them as bad news they were treating them as excellent news and really useful and people got so into it they were reporting all sorts of stuff.

Claire: What’s changing now is the way the data is collected - not so much as numbers and statistics any more, but as shareable stories. And that’s having an impact on how we learn to plan and respond for disasters in the future.

Jan: If you’re looking for the rich picture of why did people do what they did, and how did it make them feel, and how is this going to help us in the medium term, then actually collecting the information as stories to share is something that more and more organisations are trying to find ways of doing. These don’t all necessarily have to be bad news; a near miss can also be a story of how I recovered, this is what happened and this is what I did.

Claire: At the nuclear power station Jan studied, a group of operations people actually went above and beyond to build a story library of things happening to people during their shifts, to later test in their safety simulator at the site.


Jan: They would go off and run those scenarios through the simulator and play with them until they understood what was actually going on and could get a better feel for, ‘this is what happened and this is how we should respond to it next time’.

Claire: The problem with all that is that bosses have to make it clear they welcome a culture of chronic unease, and prove it by welcoming bad news.

She says the way to do it is build an organisational structure that reports upwards from workers to management.

Jan: That also requires though, that senior people welcome bad news because if you’ve got senior people who only want to hear good news then they’re only going to get good news because no-one’s going to tell them the problems.

Claire: It’s what Jake means when he says workers are reluctant to report concerns, particularly in a downturn. The more worried they are for their jobs, the less likely they are to report.

And while people like Steve are optimistic the industry is about to experience a turnaround, with oil prices hitting four year highs, the pressure on costs is still being felt on the rigs.

Gina Sims, who founded the offshore women’s support group, saw the impact first-hand through her husband.

Gina Sims: I certainly know Peter was only 60/61 when he, he'll be 62 this year, when he retired but I could see him coming home time after time because it had gone from being two on two off and then it was two on three off cause they took all their holidays away from them, so they gave them this extra week off. And then it was three on three off, so they're tired.

Peter worked offshore for 40 years, and they’ve both become progressively more nervous as rigs find ways to tighten their operating costs.

Gina: I mean Peter said "I'm getting fed up having to check things, I'm going on a job, or I've been sent to a job and I've sorted it when the last person that sorted it didn't do it right" and that's an accident waiting to happen. I could see bit by bit he was getting more stressed about going to work. In the end, he said to me "I think I'm going to retire" and I said "I think, I think that's the right decision for you to do". Because something will happen.

Claire: People still remember the horrors of Piper, but that chronic unease seems to have faded from the collective consciousness of the offshore oil and gas industry.

Jan says complacency can be countered through strong storytelling -- she’s researching how stories about air-crash disasters can be used to teach resources sector people about what can go wrong in their own industries.

Jan: It’s a matter of bringing it back to consciousness because as I said, there’s no neon sign flashing above the important decision, it’s kind of reminding people that if they work in this sector then things that they do every day could cumulate to cause a disaster. And it’s being mindful of that, I mean not incredibly hypervigilant or you couldn’t stay in the sector but just to have that in the back of your mind that everything you do is important.

Claire: I also asked everyone I met in Scotland how they would reduce the risks that come from people normalising unsafe practices, starting with Steve:

Steve: For me, that’s when I really thought about my influence, was how do you connect with the people who are really at the sharp end, and I made it and I will again make it my business to make sure they understand their role as individuals and I talk about, I don’t preach, I just talk about, individual responsibility, accountability, consequences, and that’s about getting individuals to understand whether it’s sitting at a desk or holding a spanner you can make a difference.

Claire: I think Steve’s right, but changing the accountability individuals feel is a much bigger challenge than changing protocols or regulations.

If you’re out on a rig, or in any workplace where there’s a couple of hundred people working, what is it that makes the difference between an individual worker stopping to fix a problem, or report a near miss, or speak up about something that worries them versus just walking past and assuming someone else will look after it? Maybe the answer is simply chronic unease?

No criminal charges were pursued against Occidental after Piper Alpha because there wasn’t enough conclusive evidence about the cause of the accident or who was liable.

Too many of the key witnesses were dead. I ask Lord Cullen whether he felt a disconnect between the evidence that he found and the consequences.

Lord Cullen: An inquiry chairman like myself is not there to attribute blame, but that doesn’t mean you don’t express criticism where you think it’s required.

Claire: How does someone separate the roles of inquirer and judge, and still be a spokesperson for safety, without becoming a little bit sceptical about whether we can ever successfully motivate whole workforces to take safety more seriously?

Lord Cullen: That's part of the art of leadership. I mean, the more that you look at these events. I talked a moment ago about the triggering event, and then underneath that, perhaps, you're digging down, you get practices, procedures and so on. But below all of that, and supporting the whole thing, should be the art of leadership: communicating a commitment to safety and things of that sort. There are passages, I think, in my report which touch on this, but the more I read of accidents the more I believe that to be absolutely critical.

Claire: But can the absence of leadership only ever be seen in hindsight?

Lord Cullen: I find it very hard to answer that question. I don't think it would show up, for example, on inspections or on audits conducted from outside. I suspect people who work in a particular industrial context become aware they're not getting the leadership they should get. But sadly this is only exposed if you have the time and the opportunity to dig down and find out what exactly has been going on, and sadly, that very much depends upon something bad, something really bad, happening.

One thing I would warn about is that there is a danger in adopting a tick-box attitude to safety, and perhaps that comes through in the report. So if you're going to avoid that, then you've got to be much more generally concerned with people's attitude towards safety and not simply with what is prescribed, and that takes a lot of work and it requires determination.

Claire: I can’t help but ask about his own attitude to risk, given everything he’s seen. It turns out even talking about driving his car, Lord Cullen manages to slip in the occasional pearl of wisdom. 

Lord Cullen: I think as I get older I'm increasingly cautious, over-cautious. I'm thinking about driving, I'm very, very over-cautious, almost to a... I was thinking the other day, I was coming out of a dangerous exit. I hate coming out because there's a corner around here, and I was looking to the right, that's OK, but I was still looking to the left when something came up from the other side. Now you see, that's the danger of being over-cautious: of preoccupying yourself with something and suddenly blinding yourself to something that was there to see.

Claire: So, if we can’t ever stop complacency completely, can we mitigate the risk that comes with being preoccupied, or can organisations plan for the eventuality that process and safety systems will fail?

Trauma psychologist and disaster specialist David Alexander says he’s seen a lot of companies and organisations present accident and crisis plans they think are spectacular, but which are actually next to useless. 

David: There is an issue for planners to think through, what I call "think through to destruction." I mean, any plan, you have to test to destruction. Get guys, not your employees, your juniors. Get guys who are going to say "I wouldn't do that because you've got a helipad above you." "I wouldn't do that because you're right next to a riser."

You must think I'm terribly sceptical and cynical, but I don't think I am actually. I've just seen a lot of it. Some people will come up with a very good plan. But they then tend to get their own HSE colleagues to test it. Now that's fine. They are experts in their own right. I would get other guys, it might be guys who actually work in that environment, or, not to be immodest, it might be trauma specialists, who'd say, "whoa whoa, what are you going to do if somebody collapses with a heart attack? Do you leave them? CPR?" And you should really have Plan A, Plan B, and I would say Plan C. The Japanese produced a wonderful plan for an earthquake, this wonderful Plan A, and it didn't work. And there was no Plan B.

Claire: It all sounds a bit more gung-ho than I expect most companies would be used to. I know David’s comfortable advising on disasters in war zones and hostage negotiations, but I’m not sure if you can draw a parallel between that kind of thinking and the way you’d approach a company’s safety plan.

David: It's difficult, believe me. This is not to be pejorative, but they do respond to matters relating to money. And if I quote to them the sums of money which were required after Piper or other major disasters, I'll say "needing to spend a million and a quarter pounds on an exercise is about a third of what you have to pay one man who lost a leg." I know that's a rather crude kind of logic, but I have to talk the language they're thinking in.

Claire: And, he says, keep planning. Test constantly.

David: Because safety standards, if nothing happens, safety standards decline. When you've passed your driving test, you know, it's mirror, signal, manoeuvre, but after ten years it's the old texting away, having a fag.

So we're all culpable, and I'm culpable too except when I'm on duty as it were. Standards do decline, so you've got to keep testing them. Have a false alarm. Boy, I can tell you that stirs you up. There's an expression I call overlearning. Don't just do it once. Do it until they'll actually be saturated with it. Otherwise you revert to type in nature.

Claire: In the offshore industry at least, those like Steve Rae and Lord Cullen are confident workers are much safer because of what was learned from Piper. 

We might have an explosion or we might have a gas leak or we may have a structural failure but it will be contained, I’d like to think.

Jan Hayes was working offshore for Esso on that day in 1988. She says the impact for her was not so much hearing the news, but watching the faces of people who had worked in the industry for years, as they heard what had happened.

Her baseline had already been set, two years earlier, when she was involved in the accident on the West Kingfish platform that killed her colleague.

That incident, and her role implementing some of the Cullen Inquiry recommendations in Australia shaped how she thinks about safety, and risk.

But as time moves on, disasters, accidents, and near misses lose their impact as routines and complacency creep back in, and people naturally start to normalise deviance.

Jan says huge amounts of money were spent around the world after Piper on improving blast protection and safety equipment, but people forgot there’s a whole lot that can be done to make workforce processes and operations safer.

Jan: These days accident investigations do tend to understand a model of accident causation that links more to organisational circumstances and follow asking why, five times, so you know these individuals didn’t come to work intending to have an accident so why did they do what they did and looking at those circumstances behind it.

Claire: What I’ve learned doing this podcast is that if, or when, tiny deviations away from the baseline of safety all line up, it can take down a thirteen-storey oil rig. Or a space shuttle. Or a warehouse. Or a suburban home.

The challenge is recognising when we’ve deviated, and for the oil and gas sector at least, industry experts say the warning signs are now there.

Jake: So you’ve got an industry that's performing worse, you've got an increase in non-compliance, so in real terms there should be more safety concerns being raised, yet there's been a 70 per cent drop in reporting. That's not good. That should be seen as a very, very worrying trend. Because if workers aren't reporting to HSE, even using that hotline, and you've got those other elements, then you're getting very, very close. And HSE has said that we've come very, very close a couple of times.

Claire: Part of what Steve and Jan, and Geoff and Lord Cullen work on now is finding ways to instil in people and organisations that feeling of chronic unease, to change the way we think about risk.

Jan: Because often people assume they know why someone or a group of people is behaving in a way they don’t expect and they think the problem is those people, and if only those other people behaved in a different way it would all be fine. But you may find that in ways you don’t intend they may be taking their cues from you. And it might be that you’re saying things but they don’t see the way you act as being consistent with what you’re saying.

Claire: It makes me think of something I found when I was researching Steve, and Piper. He’d written it in a LinkedIn article from a few years ago:

“You are free to choose, but you’re not free from the consequences of your choice.”


This has been Baseline by SafetyCulture.

For more information on the iAuditor app and how SafetyCulture helps businesses identify what’s working and what’s not in their own operations, visit www.safetyculture.com

Baseline is produced for SafetyCulture by Audiocraft’s Jess O’Callaghan and by me, Claire Stewart. Sound design by Tegan Nicholls and original music written and performed by Kerryn Joyce and Kirsty McCahon.

Thanks also to Pauline Hailstones, Lord and Lady Cullen, Steve Rae, David Alexander, Geoff Bollands, Gina Sims, Jake Molloy and Jan Hayes for their gracious assistance.