Why Is It Just So Difficult? Barriers to ‘Just Culture’ in the Real World

This article is a reproduction of an article published in HindSight magazine issue 35 in September 2023 (all issues available at SKYbrary).

At the heart of Just Culture lies a simple acknowledgment: we all make mistakes. Sometimes we forget things, we don’t see or hear things, we misperceive and misinterpret things, we misjudge things, we make decisions that do not fit the evolving situation, we do or say things that we didn’t mean to do or say. We all do this, in the living room, in the ops room, in the board room, even in the court room. None of us is immune. These unwanted moments are a great leveller.

So how can we judge people for making mistakes – for being human? No mistake should be enough to trigger a disaster. Systems that require perfect performance by human controllers are bad systems, because they deny human nature. Complex, safety-critical systems should be heavily defended against normal variability in the workings of the head and hands.

But sometimes it is easy for things to go disastrously wrong, and so this quandary remains difficult to reconcile. My interest in this issue dates back to the late 1990s, when I was a young psychology student. I eventually completed my doctorate on the topic twenty years ago. I consulted hundreds of academic papers, analysed hundreds of incident reports, and spent hundreds of hours in control rooms and simulators, observing and interviewing controllers. What did these brain blips have in common?

At that time, with my psychologist’s perspective on ‘cognitive errors’, what they had in common was a deviation from one’s own intentions and expectations. But for other stakeholders, what they had in common was a deviation from others’ expectations and requirements, including those of other professionals, organisations, the criminal justice system, the media, and citizens. I became increasingly uncomfortable. “Human error” was used by many to imply cause and culpability. This made everything more complicated, especially when it comes to decision-making and habits, where we enter the realm of conduct and practice. But right and wrong are not black and white.

In the last decade or so, my colleagues and I have spent over 30 weeks with controllers, engineers, managers, safety specialists, and others in air navigation service providers in over 30 countries, talking about Just Culture and safety culture in workshops. Together with colleagues, I have also worked with prosecutors and judges, along with pilots and controllers. In a patient safety context, I have collaborated on approaches to Just Culture within healthcare, given and heard evidence at a committee meeting in the UK Houses of Parliament, and given evidence at a hearing for a review of Gross Negligence Manslaughter.

The perspectives I gained during this time are so numerous, diverse, and intermingled that it is not possible to do justice to them all. But what emerged were many barriers to Just Culture. These are what make it so difficult. So, that is the focus of this article. For each kind of barrier, a whole book could be written, but I hope that the sketch below gives an impression of some of the barriers that we need to talk about if we are to make progress.

Conceptual Barriers

“We may try to achieve a common culture across the organisation, but you can’t ‘design’, ‘engineer’ or ‘implement’ a culture of any kind.”

Just Culture is defined in Regulation (EU) No 376/2014 as “a culture in which front-line operators or other persons are not punished for actions, omissions or decisions taken by them that are commensurate with their experience and training, but in which gross negligence, wilful violations and destructive acts are not tolerated”. But ‘Just Culture’ is not really a culture per se, or even a subculture. It is a trope – a figure of speech or recurring theme. It puts a focus on a particular value – justice – within a culture. Just Culture is a reason to have a conversation. An organisation may have supporting policies and processes, and there may be overarching regulation, but a conversation is needed to uncover how we think and act. Different groups (with different subcultures) have different ideas and ideals.

We may try to achieve a common culture across the organisation, but you can’t ‘design’, ‘engineer’ or ‘implement’ a culture of any kind. Unfortunately (or fortunately, depending on your perspective), culture is largely read-only/write-protected. There is change, but adaptive change is mostly bottom up, and slow. True cultural change means changing shared values, beliefs, assumptions, and practice. That’s hard enough for one person trying his or her best! For a thousand people…? Good luck. So, culture change is not usually centrally directed or top down. Culture change is evolutionary – more glacial than galloping – as groups learn and pass on lessons for their survival. But safety and justice are important values, and the notion of ‘Just Culture’ helps to trigger conversations about them.

Personal and Social Barriers

“Our judgement of performance is affected by the severity of the outcome, hindsight, and who is affected.”

Whatever our culture, we are all different. We have different values, beliefs, attitudes, and habits. When it comes to justice and fairness, we also see the world very differently. Some people accept the ‘just world hypothesis’, and assume that a person’s actions inherently bring morally fair consequences to that person. And people have different attitudes to mistakes. Some are unforgiving, and see even rare mistakes as a sign of incompetence. Punishment is often seen as a useful corrective measure. Most of us have this attitude in some circumstances. If it is your relative who is harmed by a distracted driver or an overconfident surgeon, your perception of justice will tend to differ compared to when an unknown person is harmed. Our judgement of performance is affected by the severity of the outcome, hindsight, and who is affected.

Importantly, the Just Culture ideal is built on trust, and trust is fragile. In an organisation, it takes a long time to develop confidence that one will not be punished for mistakes that constitute normal human variability, and this trust can be rapidly eroded. A change of manager to one who is unsympathetic to the reality of work-as-done can undo a lot of work on Just Culture. This fragility highlights once again that Just Culture isn’t a ‘culture’, as such; it’s an agreement.

Linguistic Barriers

Philosopher Ludwig Wittgenstein wrote that “the limits of my language mean the limits of my world. All I know is what I have words for.” The form of something, even the very existence of it, depends to a large degree on the words we have to describe it. In this sense, words shape worlds (Shorrock, 2013). Our safety lexicon is not neutral, and certainly not positive. This shapes a deficit-based way of thinking, which further reinforces deficit-based language. If you think about the words associated with safety management, for instance as might be found in the glossary of a safety report, you’ll find a negative tone: accident, cause, danger, error, failure, harm, hazard, incident, loss, mistake, near miss, negligence, risk, severity, violation. You’ll find relatively few words to describe how safety is created, and those that one finds are rarely ‘human’ (e.g., barriers, redundancy). The same goes for taxonomies used for incident analysis. Again, the terms are routinely negative (e.g., poor teamwork, inadequate supervision), reinforcing a human-as-hazard perspective. (They could just as easily be neutral, e.g., teamwork, supervision.)

To make matters worse, slogans such as ‘zero accidents’ and ‘never events’ send messages that undermine safety and justice (Shorrock, 2016). For doctors, ‘First, do no harm’ is a commonly cited principle. It is often misunderstood as ‘zero harm’, when it originally meant ‘abstaining’ from intentional wrongdoing, mischief and injustice. It did not refer to mistakes. We might see it as an early line in the sand.

Professional and Organisational Barriers

“Our ideas about justice and the acceptability of occupational conduct are deeply ingrained in our own professional background.”

Different professions have different ideas about justice and associated issues such as mistakes, competency, and negligence. There can be striking differences between operational and engineering staff, for instance. For engineers, there tend to be fewer shades of grey in both procedure and practice. But professionals – with insider knowledge and high expectations – can be the harshest critics of their peers. We tend to fear the judgement of our peers the most, but we coalesce to repel the judgement of external parties, such as managers or prosecutors. This is valid in a sense, because external parties don’t understand the work. (Whether we want them to understand the work or not depends on how we imagine the outcome of their judgement.)

Each profession – operational, HR, legal, safety, regulation – also takes comfort from its own form of déformation professionnelle, and experiences ‘trained incapacity’ (see Shorrock, 2013). Our professional experience deforms the way we see the world, at least to other people outside of our occupational clique, and even incapacitates us. It creates differences in how the same decisions and conduct are viewed in retrospect. Our ideas about justice and the acceptability of occupational conduct are deeply ingrained in our own professional background. Some acts are deemed unacceptable a priori. Organisations sometimes give examples. These usually involve illegal use of alcohol and drugs, as well as forgery or falsification. But in the middle lies a grey area of conduct. Some organisations adopt engineering-style flowcharts to help navigate this, which may be a good starting point, but may also reflect our stage of maturity when it comes to conversations about practice.
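For illustration only, the kind of decision sequence that such flowcharts encode can be sketched in code. The sketch below is a hypothetical simplification, loosely modelled on widely used culpability decision trees (intention, knowing violation, substitution test); the names and categories are my own assumptions for the sketch, not any organisation’s actual policy or test.

```python
# A much-simplified, hypothetical sketch of the kind of decision sequence that
# engineering-style conduct flowcharts encode. Illustration only.

from dataclasses import dataclass


@dataclass
class ConductReview:
    intended_act: bool          # Was the act itself intended?
    intended_harm: bool         # Was the harmful outcome intended?
    knowingly_violated: bool    # Was a workable procedure knowingly set aside?
    passes_substitution: bool   # Would comparable peers likely have done the same?


def provisional_category(review: ConductReview) -> str:
    """Return a provisional label; every outcome still needs a conversation."""
    if review.intended_act and review.intended_harm:
        return "possible destructive act - investigate and escalate"
    if review.knowingly_violated and not review.passes_substitution:
        return "possible reckless conduct - examine context, pressures, and procedures"
    if review.passes_substitution:
        return "normal human variability - focus on learning and system improvement"
    return "grey area - needs a structured conversation, not a verdict"


# Example: an unintended slip that comparable peers say they could easily have made.
print(provisional_category(ConductReview(
    intended_act=False, intended_harm=False,
    knowingly_violated=False, passes_substitution=True,
)))
```

Even in this toy form, the limitation noted above is visible: the output is only a provisional label, and every branch still calls for a conversation about context, pressures, and work-as-done.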

Historical Barriers

“When someone is blamed for an ‘honest mistake’, it is like a social oil spill. The pollution sticks around for a long time.”

Organisations have a history, which includes unwanted events and how people are treated following such events. People in organisations have a memory of these events, which influences their beliefs about the future. How will I be treated if I make a mistake and things turn out badly? It makes sense to consider how others were treated in similar circumstances.

If someone was previously treated unfairly, this influences how I think, feel, and act. Interestingly, memory of previous episodes is somewhat independent of whether a person was even in the organisation at the time. It is encoded in organisational folklore, passed on from member to member, and so influences behaviour even for those who were not part of the history. When someone is blamed for an ‘honest mistake’, it is like a social oil spill. The pollution sticks around for a long time. It remains even after the person who passed judgement has left the organisation. Ironically, mistakes in handling others’ mistakes are among the least readily forgiven by groups of professionals who find themselves under the spotlight. The clean-up operation can take a generation unless apologies and amends come quickly, and they rarely do.

Regulatory Barriers

Regulations are infused with messages – explicit and implicit – about ‘safety’, ‘justice’, and ‘acceptability’, even if the words aren’t used. The provisions and articles are not always consistent or compatible, partly because of the huge effort that such consistency would require. Constraints on regulatory resources mean that a more efficient solution is chosen instead: leave people to interpret the regulation and resolve its vagaries and inconsistencies. In the now-famous definition of Just Culture in EU 376/2014, we are left to define for ourselves what is meant by “gross negligence” and “wilful violations”.

We need to interpret what is meant by “actions, omissions or decisions taken by them [frontline operators or others] that are commensurate with their experience and training”. And who are the “frontline operators” and “others”? The confusion at least reinforces the point that ‘just culture’ is an idea and a reason for a conversation, not a thing that exists out there in the world.

Technological Barriers

“Technology can make it easy for things to go catastrophically wrong.”

Technology can make it easy for things to go catastrophically wrong. We somehow accept this for some technologies (e.g., trucks, buses, cars), partly because they offer convenience that we value more than the risk of harm. We do not accept it for other technologies, but still it happens. Spain’s worst train crash in over 40 years is testament to this. The derailment happened 10 years ago, on 24 July 2013, when a high-speed train travelling from Madrid to Ferrol, in the north-west of Spain, derailed on a curve four kilometres from the railway station at Santiago de Compostela. Eighty people died. The train was travelling at over twice the posted speed limit of 80 kilometres per hour when it entered the curve. The technological system allowed this to happen. Neither the passengers nor the driver was protected, but “human error” by the driver was blamed in the aftermath (see Shorrock, 2013). Ten years later, the trial remains ongoing. There are other examples of how ‘simple mistakes’ – of the kind that anyone can make – can precede disaster. The real mistake is the failure to mitigate such inevitabilities.

Legal and Judicial Barriers

Whatever the attitudes to safety and justice inside an organisation, organisations operate in a legal context. Naïve ideas about not punishing innocent mistakes may collide at speed with reality once a prosecution commences. In many civil law jurisdictions, prosecutors lack discretion over whether to file charges and how to present a case. So unintended ‘honest mistakes’ may well be criminally relevant acts of negligence that must be prosecuted according to the penal code. (In this context, incidentally, the famous question, “who draws the line?” is easily answered: a judge or jury.) In a common law context in England, Wales and Northern Ireland, ‘Gross Negligence Manslaughter’ applies to deaths in a workplace of any nature. What is interesting is that the degree of negligence needs to be “very high”, and conduct must “fall so far below the standard to be expected of a reasonably competent and careful [person in the defendant’s position] that it was something truly, exceptionally bad.”

But we also have to grapple with our confused and inconsistent standards when it comes to legal action. An ordinary driver who displays essentially the same behaviour as a train driver, professional pilot, or air traffic controller will be judged quite differently, depending also on the outcome. We commonly agree that faults in driving ought to be punished. We even have specific laws for driving conduct. Again, in England, Wales, and Northern Ireland, driving offences mainly fall under two categories: dangerous driving, and careless or inconsiderate driving. Dangerous driving includes obvious things such as racing and ignoring traffic lights, but also using a hand-held phone or other equipment, looking at a map, talking to and looking at a passenger, or selecting music. Careless driving, or driving without due care and attention, is committed when driving falls below the minimum standard expected of a competent and careful driver, such as unnecessarily slow driving or braking, dazzling other drivers with un-dipped headlights, or turning into the path of another vehicle. What counts as an ‘honest mistake’ depends on the context and the outcome.

Societal Barriers

“‘Just Culture’ is entangled in a struggle with the pervasive fear that we have created systems that can fail catastrophically, albeit very rarely, seemingly as a result of ordinary and inevitable human variability.”

‘Just Culture’ is entangled in a struggle with the pervasive fear that we have created systems that can fail catastrophically, albeit very rarely, seemingly as a result of ordinary and inevitable human variability. Complex systems have a terrifying habit of operating efficiently close to a tipping point into failure. Professionals whose contributions are closest to that tipping point become the target for the dual fear response of anger and blame. In psychology, this is known as ‘displacement’. Despite being set up to fail, these professionals are simply the most convenient people to blame in the heat of the moment. Headlines of “human error causes accident” mirror our appetite for simple, low-context, low-complexity explanations that come with a scapegoat upon which to offload our anxiety about what we’ve created.

Evolutionary Barriers

Our sense of justice is not unique to modern humans. We have inherited it from our primitive ancestors. This can be seen in our closest relatives: chimpanzees discipline greedy peers who cheat or are otherwise uncooperative. Other mammals administer justice in groups for breaches of social norms. Some group norms are essential for group survival and so deviations will not be tolerated. But our evolution has hamstrung our thinking about justice. We make simple-to-complex reasoning errors; our thinking and internal reactions about simple situations are transferred to unwanted events in complex situations. But for complex, high-hazard socio-technical systems that need to be defended heavily from the effects of simple mistakes, this thinking and feeling is misplaced.

So, What Can We Do?

“Systems should be designed – so far as is reasonably practicable – to prevent catastrophic outcomes.”

It seems that we are in a phase of confusion. We are trying to work things out. Acknowledging this is a good first step. Perhaps we can accept, though, that people make genuine mistakes, all the time. And sometimes – but quite rarely – conduct really is unacceptable. Using the words of retired English judge Sir Brian Henry Leveson, who served as President of the Queen’s Bench Division and Head of Criminal Justice, we must sometimes identify “the line that separates even serious or very serious mistakes or lapses, from conduct which was truly exceptionally bad”. This was directed at gross negligence manslaughter but, setting aside the fatal outcome, it seems reasonable to apply it more generally when it comes to corrective justice. And remember that the term ‘serious mistakes’ does not necessarily refer to outcome: systems should be designed – so far as is reasonably practicable – to prevent catastrophic outcomes. Complex, high-hazard systems such as transportation, healthcare, and power generation must be defended from the effects of such mistakes. If it is easy for things to go disastrously wrong, that is a more fundamental mistake of design and management.

And many are harmed in some way when things go wrong. So, we should seek to identify who is impacted, understand their needs, and help to meet those needs. This is the essence of restorative just culture, which has additional complications (for instance, those who are impacted may express a need for retributive justice).

By reflecting on our own reactions to failure, and how we contribute to creating, maintaining and overcoming each of the barriers to Just Culture, we can genuinely do our part for justice at work, at home, and in society more generally. This way, even though unwanted events will always be hard to handle, there may be fewer barriers to learning and healing from them.

References

Shorrock, S. (2013, December 12). Déformation professionnelle: How profession distorts perspective. Humanistic Systems. https://bit.ly/HSDefPro

Shorrock, S. (2013). Human error: The handicap of human factors, safety and justice. HindSight, 18, 32-37. https://www.skybrary.aero/articles/hindsight-18

Shorrock, S. (2016, February 27). Never/zero thinking. Humanistic Systems. https://bit.ly/HSNZT


This blog is written by Dr Steven Shorrock. I am an interdisciplinary humanistic, systems and design practitioner interested in work and life from multiple perspectives. My main interest is human functioning and system behaviour, in work and life generally. I am a Chartered Ergonomist and Human Factors Specialist with the CIEHF and a Chartered Psychologist with the British Psychological Society. I work as a human factors practitioner and psychologist in safety critical industries. I am also an Adjunct Associate Professor at University of the Sunshine Coast, Centre for Human Factors & Sociotechnical Systems. I blog in a personal capacity. Views expressed here are mine and not those of any affiliated organisation, unless stated otherwise. LinkedIn: www.linkedin.com/in/steveshorrock/ Email: contact[at]humanisticsystems[dot]com
