System Safety: Seven Foes of Explanation

In this short series, I highlight seven foes and seven friends of system safety, both for explanation and intervention. Each is a concept, meme, or device used in thinking, language, and intervention (reinforced by more fundamental foes that act as barriers to thinking).  They are not the only foes or friends, of course, but they are significant ones that either crop up regularly in discussions and writings about safety, or else – in the case of friends – should do.

In this post, I outline seven foes of explanation. To keep it short (something I usually intend, but rarely achieve), I cap each explanation at an arbitrary 100 words.

In this series:

  1. Seven foes of explanation in system safety (this post)
  2. Seven foes of intervention in system safety
  3. Seven friends of explanation in system safety
  4. Seven friends of intervention in system safety
[Image: Ken Douglas, CC BY-NC-ND 2.0, https://flic.kr/p/cwUw5]

1. Human-error-as-cause

‘Human error’ is a vague, ambiguous and poorly defined bucket concept that tends to combine psychological variables (such as intention and expectation) and outcome variables (unwanted, by someone). From a psychological perspective (concerning departures from one’s own intentions or expectations), the concept is less problematic, but it focuses on the head at the expense of the world. As an explanation in a complex system, the concept is widely misused and abused, especially to infer causation.

2. Causal chains

The idea of causal chains, including domino or ‘5 Whys’ approaches, implies a linear ordering of cause and effect. Causal chains force people to think of complex socio-technical systems as if they were ordered technical systems, with clear, linear cause-effect relationships between components. In reality, complex socio-technical systems are defined more by non-linearity, temporariness of influence, and emergence.

3. Root cause(s)

At face value, the idea of a root cause “that, if removed, prevents recurrence” is obviously nonsensical; why-because arguments can go on ad infinitum. The oft-used stopping point “that management can control” is convenient and easily abused. The concept encourages the idea of a single root cause, ignoring causal loops, emergent, synergistic or holistic effects, and often even multiple, jointly necessary, contributory causes. It is, of course, an efficiency-thoroughness trade-off by the analyst, but one hidden behind an illusion: that going ‘down and in’ will get you to the ‘real cause’, which is actually a social construction.

4. Causes, generally

The way we think of ‘causes’ in the analysis of complex work situations is often at odds with the conceptual and theoretical basis of causation. While the concept may seem unproblematic when it comes to physical cause-effect relationships, such as a hand pressing a button or pulling a lever, the same cannot be said for relationships involving less visible, less tangible system components. As one goes up and out into the system and context or environment, or – at a psychological level – down and in to the human mind, it is more advisable to refer to interaction and influence.

5. Loss of situation(al) awareness/crew resource management

‘Situation(al) awareness’ is an aggressive concept that emerged from the pilot community, and subsequently human factors engineering, before taking on a life of its own, gobbling up more useful and theoretically valid concepts that have long histories in psychological research and that better define and specify the cognitive processes involved. CRM has a similar heritage. Both are often used counterfactually as a proxy for ‘human error’, individually or collectively. In the case of loss of SA, the term refers to the ‘loss’ of awareness – of past, present, or future (!) – with implications for individuals and for system safety.

6. Violations

The term ‘violation’ has an intensity of connotation and implication that – especially in the context of its more common uses – makes it inherently violent. It is one of a few terms in safety that tend to prejudge and label work behaviour without really understanding why work-as-done is not always in accordance with work-as-prescribed – and very often is not, and cannot be, completely so. Rather than truly understanding these differences, we tend to classify the violations. The term itself acts as a barrier to discussion and reporting of messy, real-world situations.

7. Monolithic explanations, generally

Monolithic explanations act as proxies for real understanding, in the form of big ideas wrapped in simple labels. The labels are ill-defined and come in and out of fashion – poor/lack of safety culture, lack of CRM, human error, loss of situation awareness – but tend to give some reassurance and allow the problem to be passed on and ‘managed’, for instance via training and safety campaigns. Often, the same term in reverse may be used to ‘explain’ success, meaning that almost all wanted/unwanted outcomes are due to the same one thing, absent or present.


If you want to learn more about how complex systems fail, in the most concise and easy-to-digest form, read this by Richard Cook.
