System Safety: Seven Foes of Intervention

In this short series, I highlight seven foes and seven friends of system safety, for both explanation and intervention. Each is a concept, meme, or device used in thinking, language, and intervention (reinforced by more fundamental foes that act as barriers to thinking). They are not the only foes or friends, of course, but they are significant ones that either crop up regularly in discussions and writings about safety, or else – in the case of friends – should do.

In this post, I outline seven foes of intervention. To keep it short (because I usually intend that, but rarely succeed), I limit each explanation to an arbitrary 100 words.

In this series:

  1. Seven foes of explanation in system safety
  2. Seven foes of intervention in system safety (this post)
  3. Seven friends of explanation in system safety
  4. Seven friends of intervention in system safety
Image: Michael Coghlan (CC BY-SA 2.0) https://flic.kr/p/aZBrSZ

1. Haste

When responding to an unwanted event, there is often a sense of urgency to choose a solution. This meets a need to reduce the anxiety associated with uncertainty. It often results in a premature choice of intervention, without properly understanding the problem situation(s), the system components, interactions and boundary, and the context, and without considering other possible interventions. Effort is then focused on implementation, bringing relief that something is in progress. But the intervention itself may be built on false assumptions about the problem and the evolving system in which it exists or existed.

2. Overreaction

A single unwanted event (such as this example), set against perhaps tens of thousands of successes, can trigger a system-wide change that makes work harder for many stakeholders, and perhaps riskier. When overreaction and haste are combined, efficiency is favoured over thoroughness, and critical understanding is missing. Secondary problems are common, and may well be worse than the original one. Because risk assessments often have a component focus, the secondary problems are not foreseen. The result can involve large compensatory adjustments.

3. Component focus

System safety concerns interactions between the micro, meso and macro components or elements of socio-technical systems – human, social, organisational, regulatory, political, technical, economic, procedural, informational, and temporal. Everything is connected to, and influences, something else. The system is more than the sum of its parts, and does something that no component can do alone. But organisations are formalised around functions and silos, and interventions are often made at the level of components, instead of interactions and flows. Acting on individual components blinds organisations to the interactions between components, suboptimising the system by changing system-wide patterns and creating unintended consequences elsewhere.

4. Over-proceduralisation

Work-as-prescribed – rules, procedures, regulations – is necessary to guide work-as-done and keep variation within acceptable limits. But work can rarely, if ever, be completely prescribed. Work-as-done takes work-as-prescribed as a framework for human work, adjusting and adapting to situations, drawing from and connecting disparate procedures, in a dynamic and creative way. But from afar, there can be a fantasy that work-as-done and work-as-prescribed are closer than they really are, and so nailing down more detail and tightening regulatory requirements become favoured intervention strategies. The result is more pressure and fewer degrees of freedom for necessary human performance adjustments.

5. Scapegoating

Blame – whether individual- or system-focused – is a natural human tendency following unwanted events or situations, in all aspects of life. Feeling or assigning some moral responsibility is natural and – in some cases – necessary. It is fundamental to the rule of law, especially to prevent or punish the intentional infliction of harm. But scapegoating singles out and mistreats a person or group for unmerited blame. This relates to component focus, above, since one component is unfairly blamed. The result may satisfy outrage or displace responsibility without solving a wider or deeper problem, leaving the system vulnerable to similar patterns of dysfunction – a moral and practical problem.

6. Never/zero thinking and targetology

Never/zero thinking and targetology involve conflating a measurement with a goal (or anti-goal, in the case of accidents). With never/zero thinking, the implication is that there can be zero harm or zero accidents, while non-zero targets may specify a maximum number of unwanted events in a given time frame, often with consequences for breaches. One intention is to motivate people to be safe and to avoid accidents. This misunderstands the nature of accidents, measurement, and human motivation. Unintended consequences tend to be hard to see from afar (e.g., under-reporting), resulting in blunt-end Ignorance and Fantasy, perhaps reinforced by green-light dashboards.

7. Campaigns

Organisational campaigns are a favoured top-down means of change, often triggered by new management. They are characteristic of the ‘done to’ and ‘done for’ modes of change and may concern, for example, safety culture, error management, team/TRM training, ‘hearts and minds’, behavioural safety, or high reliability organisations. This is often done via external training consultants. Unless the activity helps to understand work-as-done (including the messy reality) in the context of the system as a whole, the effects visibly wear off shortly after the campaign ends. Staff know this dynamic well; it has been done to/for them many times.


A note on intervention: The term intervene comes from the Latin intervenire, from inter- ‘between’ + venire ‘come’. To intentionally try to understand a situation, or to take action to change it (e.g., improve it or prevent it from getting worse), is to intervene. While there may be no intention to change a situation while observing or measuring it, change is very often an unintended consequence.

If you want to learn more about how complex systems fail, in the most concise and easy-to-digest form, read this by Richard Cook.

