When things go wrong, we seem to display a reliable tendency to do one thing: blame those at the ‘sharp end’. No matter how complex the system, how uncertain the situation, or how inadequate the conditions, our attention post-accident seems to turn to those proximal to the consequence, whom we judge to have failed to control the hazard in question. Whether the response is formal (such as company disciplinary proceedings) or informal (such as shunning or gossip), scapegoating allows us to ignore the parts that we play, as individuals, organisations, and society, including the heavy demands we make on individuals. It has probably been this way since time immemorial.
The notion of ‘just culture’ has developed over the past decade or so in response to this state of affairs. Just culture describes a collective agreement that people are not punished for their so-called ‘honest mistakes’ (which raises the question, of course, of what dishonest mistakes might be), while gross negligence, wilful violations and destructive acts are not tolerated. Nobody goes to work to have an accident (by definition), and only rarely do people act with gross negligence at work – which in any case becomes an issue for the justice system.
Just culture is highly valued by front-line staff. They are the last ones to ‘touch the system’ before it trips over itself, and it is they who are most often held responsible when things go wrong. Those who make decisions about system conditions are usually not subject to anything like the same level of scrutiny, nor do they usually have as much at stake. If you put yourself in the position of a surgeon or anaesthetist, a pilot or air traffic controller, a power plant control room operator or maintenance technician, you would probably welcome a genuine commitment from your company, and from the national judiciary, to fair treatment when things go wrong. You would probably want your actions and decisions to be viewed in the context of the situation and the system – not in isolation. Admittedly, the term ‘just culture’ is not ideal. But we are where we are, and the fundamental idea is rooted in the minds of many as a good thing.
Just culture is, however, born of the Safety-I mindset. According to Erik Hollnagel (EUROCONTROL, 2013; Hollnagel, 2014), Safety-I views the person predominantly as a hazard or liability. From this viewpoint, it might seem to make sense to blame and even punish individuals when things go wrong, following a post-mortem of their ‘errors’ and ‘violations’. However, this mindset and response only serve to ensure that people cover up their actions. In practice, this might mean not reporting adverse events, distorting the facts, colluding with others, retreating into roles, and so on. To address this, some organisations have published and enacted just culture policies, liaised with unions and associations, and even held dialogues with the judiciary to reach an understanding of the human contribution to safe operations.
Since the advent of ‘just culture’, the Safety-II perspective has emerged. Safety-II defines safety not as avoiding things going wrong but as ensuring that things go right. Safety-II views the human not as a hazard, but as a resource necessary for system flexibility and resilience. Safety-II recognises that performance adjustments, performance variability and trade-offs are not only inevitable, but vital to system safety, because system conditions cannot be specified exactly or assured with certainty. People need to make the imperfect system work as a whole.
In light of this, it has been proposed that the idea of just culture should be abandoned. If we take a Safety-II view, ‘just culture’ might indeed seem unnecessary. However, this proposal can only work in a collective mindset shaped by the Safety-II perspective. The world-as-found is one dominated by the Safety-I perspective, and is likely to remain so for quite some time; a parallel ‘Justice-I’ perspective will sit alongside it for the foreseeable future. Furthermore, we all remain human, and will retain the very ‘Human-I’ instinct to blame the imperfect other for unwanted events: both the out-group other (front-line workers for managers; managers for front-line workers) and the in-group other (fellow colleagues who did something that you think was substandard and that you would have done better).
For these reasons, just culture will and must remain on the agenda, and – for want of a better name – it is one of the Systems Thinking for Safety: Ten Principles (EUROCONTROL, 2014). While we seek to move toward Safety-II, we remain in a Safety-I paradigm. And even within a Safety-II paradigm, it is hard to imagine that humans would not revert to blame.
But the nature of the concept should evolve. Rather than focusing only on justice, or even fairness, just culture should focus on a mindset of trust, mutual understanding and openness, as well as language that is non-blaming. This should apply not only ‘vertically’ (e.g. between managers and workers); it should apply between all of us. Assuming goodwill should not only be a response to adverse events, but a baseline assumption, especially when things don’t go our way. Whatever our view of the human – as hazard or resource – just culture reminds us that we are human, and that we need to be mindful of our reactions to failure.
EUROCONTROL (2013). From Safety-I to Safety-II: A White Paper. Brussels: EUROCONTROL.
EUROCONTROL (2014). Systems Thinking for Safety: Ten Principles: A White Paper. Brussels: EUROCONTROL.
Hollnagel, E. (2014). Safety-I and Safety-II: The Past and Future of Safety Management. Farnham: Ashgate.
https://humanisticsystems.com/2014/09/27/safety-human-performance-system-from-theory-to-practice/
https://humanisticsystems.com/2014/09/27/systems-thinking-for-safety-ten-principles/
http://skybrary.aero/index.php/Toolkit:Systems_Thinking_for_Safety/Principle_3._Just_Culture
http://www.skybrary.aero/index.php/Just_Culture