Challenges

They slipped the surly bonds of earth to touch the face of God.

Those words, from John Gillespie Magee’s poem High Flight, were spoken by President Reagan on the afternoon of Tuesday 28 January 1986. They ended his address to the nation after the Space Shuttle Challenger broke apart 73 seconds after its launch earlier that day.

The seven astronauts on board did not simply disappear into the sky. As described in Adam Higginbotham’s magisterial, detailed and gripping account of the tragedy and its aftermath – “Challenger: A True Story of Heroism and Disaster on the Edge of Space” – the cabin in which they were seated did not break up but plunged into the ocean two minutes and 45 seconds later, where it was recovered from the sea floor along with what remained of the astronauts and their personal effects. As recounted in the final chapters of the book, the extraordinarily detailed recovery efforts and forensic investigations revealed that at least three of the astronauts had activated their Personal Egress Air Packs (designed for ground emergencies only), which supplied air to their helmets. One of them had been breathing from his pack for the length of time it took the cabin to fall into the ocean. As the author writes, not only had the astronaut “been conscious when he began his long descent toward the ocean, but … he had been busy, working through every procedure he could think of as he fell. He was nobody’s fool; he knew he was going to die. But he never stopped trying to live.”

It is one of many details which make the book so absorbing to read but which must have been unbearable for the families to learn.

The astronauts knew the risks of space flight. But they did not know – and were not told – of the very specific risks they were running on this flight. What makes the book so fascinating – despite us knowing what happened and why – is its minute-by-minute account of the decision-making process in the hours leading up to the flight. The night before the launch (already delayed by a few days by the unexpectedly cold weather), the engineers working for the manufacturer of the rocket boosters, Morton Thiokol, had said the flight should not go ahead: the very cold temperatures meant the rubber O-rings designed to seal the joints in the booster rockets during ignition would not work as they should. They knew this not just from theory but because the seals had been repeatedly compromised on previous flights. (The engineers had been trying to find a solution but had not yet succeeded, and there had been repeated memos explaining the problems and risks, all of which were largely lost in the complicated and extensive process NASA had put in place for assessing the viability of its projects and specific flights.) What raised the risks were the freezing temperatures: because extreme cold stops rubber from flexing as needed, the seals failed for long enough to let burning gas escape from the booster and ignite the highly flammable fuel in the external tank, leading to what NASA’s commentator, after an agonising wait, called a “major malfunction”. But what turned this flight into a tragedy was the decision of Morton Thiokol’s managers to change their original advice to NASA from “Do Not Launch” to “Go” in little over an hour.

Why was the engineers’ advice overridden? The usual reasons: the more senior managers did not want to be responsible for a failure to launch a flight on which so much depended, both for Morton Thiokol itself and for its main client, NASA. So they rationalised away the engineers’ concerns (there was no conclusive proof that if the seals were damaged disaster would result – a bizarre approach to assessing a risk, since if there were such proof there would be no decision to take) and simply ignored them, giving NASA the answer they knew it wanted to hear.

As in Challenger, so in many other tragedies.

Even after this very visible failure, Morton Thiokol’s senior managers sought to mislead the Presidential Commission about what had gone wrong – not just to protect themselves but to protect their most valuable customer. It was only because the senior engineer present at the hearing, Allan McDonald, realised what was happening and waved his hand furiously to catch the panel’s attention that his evidence was finally heard and the full story came out. The accident was not simply a technical failure. It was caused by years of NASA and its manufacturers failing to understand the specific risks they were running with the design of the rocket boosters, and of the rubber rings in particular, and – because of this – by the final catastrophic failure of the launch approval process. Human decisions and human errors, in short.

The final report was, as is all too familiar, damning: cost-cutting, faulty design, management blunders, repeated warnings ignored, institutional hubris, safety and quality assurance structures weakened. As one of the Commission’s members, the physicist Richard Feynman, wrote: “For a successful technology, reality must take precedence over public relations, for nature cannot be fooled.” NASA was described as having exaggerated “the reliability of its product to the point of fantasy” in order to keep getting government money: a finding echoed over the decades in very different sectors.

What might those not involved in space travel learn from this story?

1.     Controls, processes and systems are only as strong as their weakest point. Any part of a system can become its weakest link if it is not properly understood: why it exists, how it works – or is meant to – and what happens when it doesn’t.

2.     Understanding the risks you run is essential if effective action is to be taken to eliminate or mitigate them. NASA’s very success in establishing and running a space programme at all blinded it to the unnecessary risks it was running with these rocket boosters. How could a rubber ring, similar to those found in most taps, be allowed to hold up the astonishing achievement of space travel? But – are you listening, Post Office executives, indeed all executives? – just because something works, or appears to, most of the time is no reason to ignore warnings about what happens, or might happen, when it doesn’t.

3.     Successful systems contain the seeds of their own failure, if pushed too far. NASA was by far Morton Thiokol’s biggest customer. Morton Thiokol had every incentive to do the best possible job. But what that meant was that it also had an incentive to ignore or cover up failings which threatened that lucrative business. What NASA wanted, it got. But it was not always what it needed.

Understanding and identifying the point at which a good system or control tips over into creating the very mischief it is seeking to avoid – or a risk no-one has anticipated – is the essence of all risk management. It is not so much a question of having or not having a control or rule but of understanding how it works, how the humans operating or responding to it behave and how that behaviour may itself undermine the control or lead to unexpected or unwelcome outcomes.

Ah – humans.

What happened to those engineers and managers who spoke up explains why it is often so hard to do so when much is at stake. Allan McDonald knew that if he told the Commission what had happened the night before the launch, he would damage NASA, his employers, his colleagues’ jobs and his own career. He went ahead anyway. When he and the other engineers who gave evidence returned to their jobs, they were shunned by their colleagues and managers. So badly were they treated that they called themselves the Five Lepers. It took intervention by Congress for them to get their jobs back. It takes guts to speak up in such circumstances. Few people can – or are willing to – withstand the consequences of being made to feel a pariah by colleagues and bosses. A beautifully written law or procedure prohibiting retaliation is cold comfort when you hear a colleague shouting furiously: “If I lose my job because you guys couldn’t resist testifying, then I’m going to dump my kids on your doorstep.”

If we want evidence of this more recently and closer to home, listen to the evidence given to the Moore-Bick Inquiry into the fatal 2017 Grenfell Tower fire by junior employees of Kingspan, Celotex and Arconic, the manufacturers of the dangerous cladding and insulation used on the tower. The Inquiry found those companies to have deliberately deceived customers and the regulatory authorities about the safety of their products. Despite their concerns, the employees felt unable to challenge senior management: because doing so wasn’t “doing any benefit to my career”, or because they were “embroiled in the culture of the business and it becomes second nature”, or because they didn’t know who to speak to, or because they “lacked the, I guess, the life experience to find the right way forward and it was – it was a failure of courage and a failure of character and a failure of moral fibre on my part not to do so.”

What they said, and what happened to the Morton Thiokol employees nearly 40 years ago, say more clearly than anything else what a toxic culture looks like. It is not just a culture which puts commercial interests above ethical ones. It is not just a culture which seeks to break the law or take unnecessary risks. It is not just a culture which seeks to mislead those who put their trust in its people and products. It is also a culture which puts the most junior employees in positions where they feel unable to do the right thing, which encourages and rewards them for doing the wrong thing, and which then leaves them exposed when the consequences of decisions taken at senior levels are cruelly revealed.

And finally – note how quickly the Commission was able to report: four months after the disaster. Contrast this with the years – often decades – needed for the many public inquiries Britain now holds. Would it be unkind to suggest that they are a magnificent tool for the British state to appear to be doing something while in reality making sure that nothing of substance gets done at all? Potemkin justice: an extravagantly expensive, intricately detailed facade, while behind it the systems and behaviours which led to the scandals continue as before, the people responsible for what went wrong get away with it and, as has happened countless times before, the recommendations are not acted on but neatly filed away as a historical record of what might have been – if only death and weariness had not taken over. The process working exactly as intended, in other words. Or is that too cynical?

Whatever your thoughts on that, this book is well worth reading.

Cyclefree
